[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg. 11701 1727096115.23187: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-And executable location = /usr/local/bin/ansible-playbook python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 11701 1727096115.24030: Added group all to inventory 11701 1727096115.24032: Added group ungrouped to inventory 11701 1727096115.24037: Group all now contains ungrouped 11701 1727096115.24040: Examining possible inventory source: /tmp/network-EuO/inventory.yml 11701 1727096115.51022: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 11701 1727096115.51094: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 11701 1727096115.51117: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 11701 1727096115.51183: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 11701 1727096115.51264: Loaded config def from plugin (inventory/script) 11701 1727096115.51266: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 11701 1727096115.51309: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 11701 1727096115.51405: Loaded config def from plugin (inventory/yaml) 11701 1727096115.51408: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 11701 1727096115.51502: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 11701 1727096115.51952: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 11701 1727096115.51955: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 11701 1727096115.51958: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 11701 1727096115.51964: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 11701 1727096115.51971: Loading data from /tmp/network-EuO/inventory.yml 11701 1727096115.52044: /tmp/network-EuO/inventory.yml was not parsable by auto 11701 1727096115.52110: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 11701 1727096115.52155: Loading data from /tmp/network-EuO/inventory.yml 11701 1727096115.52248: group all already in inventory 11701 1727096115.52256: set inventory_file for managed_node1 11701 1727096115.52260: set inventory_dir for managed_node1 11701 1727096115.52261: Added host managed_node1 to inventory 11701 1727096115.52264: Added host managed_node1 to group all 11701 1727096115.52265: set ansible_host for managed_node1 11701 1727096115.52266: 
set ansible_ssh_extra_args for managed_node1 11701 1727096115.52271: set inventory_file for managed_node2 11701 1727096115.52274: set inventory_dir for managed_node2 11701 1727096115.52275: Added host managed_node2 to inventory 11701 1727096115.52276: Added host managed_node2 to group all 11701 1727096115.52277: set ansible_host for managed_node2 11701 1727096115.52278: set ansible_ssh_extra_args for managed_node2 11701 1727096115.52281: set inventory_file for managed_node3 11701 1727096115.52285: set inventory_dir for managed_node3 11701 1727096115.52286: Added host managed_node3 to inventory 11701 1727096115.52287: Added host managed_node3 to group all 11701 1727096115.52288: set ansible_host for managed_node3 11701 1727096115.52289: set ansible_ssh_extra_args for managed_node3 11701 1727096115.52292: Reconcile groups and hosts in inventory. 11701 1727096115.52296: Group ungrouped now contains managed_node1 11701 1727096115.52298: Group ungrouped now contains managed_node2 11701 1727096115.52299: Group ungrouped now contains managed_node3 11701 1727096115.52380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 11701 1727096115.52528: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 11701 1727096115.52579: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 11701 1727096115.52608: Loaded config def from plugin (vars/host_group_vars) 11701 1727096115.52611: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 11701 1727096115.52618: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 11701 1727096115.52626: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 11701 1727096115.52676: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 11701 1727096115.53289: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096115.53556: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 11701 1727096115.53598: Loaded config def from plugin (connection/local) 11701 1727096115.53601: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 11701 1727096115.55030: Loaded config def from plugin (connection/paramiko_ssh) 11701 1727096115.55034: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 11701 1727096115.57008: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 11701 1727096115.57171: Loaded config def from plugin (connection/psrp) 11701 1727096115.57175: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 11701 1727096115.58790: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 11701 1727096115.58830: Loaded config def from plugin (connection/ssh) 11701 1727096115.58834: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 11701 1727096115.63173: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 11701 1727096115.63215: Loaded config def from plugin (connection/winrm) 11701 1727096115.63219: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 11701 1727096115.63366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 11701 1727096115.63437: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 11701 1727096115.63615: Loaded config def from plugin (shell/cmd) 11701 1727096115.63617: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 11701 1727096115.63645: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 11701 1727096115.63803: Loaded config def from plugin (shell/powershell) 11701 1727096115.63805: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 11701 1727096115.63860: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 11701 1727096115.64346: Loaded config def from plugin (shell/sh) 11701 1727096115.64348: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 11701 1727096115.64384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 11701 1727096115.64617: Loaded config def from plugin (become/runas) 11701 1727096115.64619: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 11701 1727096115.65019: Loaded config def from plugin (become/su) 11701 1727096115.65021: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 11701 1727096115.65379: Loaded config def from plugin (become/sudo) 11701 1727096115.65381: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 11701 1727096115.65416: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml 11701 1727096115.66190: in VariableManager get_vars() 11701 1727096115.66214: done with get_vars() 11701 1727096115.66485: trying /usr/local/lib/python3.12/site-packages/ansible/modules 11701 1727096115.72776: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 11701 1727096115.73003: in VariableManager get_vars() 11701 
1727096115.73008: done with get_vars() 11701 1727096115.73011: variable 'playbook_dir' from source: magic vars 11701 1727096115.73012: variable 'ansible_playbook_python' from source: magic vars 11701 1727096115.73013: variable 'ansible_config_file' from source: magic vars 11701 1727096115.73013: variable 'groups' from source: magic vars 11701 1727096115.73014: variable 'omit' from source: magic vars 11701 1727096115.73015: variable 'ansible_version' from source: magic vars 11701 1727096115.73015: variable 'ansible_check_mode' from source: magic vars 11701 1727096115.73016: variable 'ansible_diff_mode' from source: magic vars 11701 1727096115.73017: variable 'ansible_forks' from source: magic vars 11701 1727096115.73017: variable 'ansible_inventory_sources' from source: magic vars 11701 1727096115.73018: variable 'ansible_skip_tags' from source: magic vars 11701 1727096115.73019: variable 'ansible_limit' from source: magic vars 11701 1727096115.73019: variable 'ansible_run_tags' from source: magic vars 11701 1727096115.73020: variable 'ansible_verbosity' from source: magic vars 11701 1727096115.73056: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml 11701 1727096115.74474: in VariableManager get_vars() 11701 1727096115.74608: done with get_vars() 11701 1727096115.74621: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 11701 1727096115.76703: in VariableManager get_vars() 11701 1727096115.76719: done with get_vars() 11701 1727096115.76728: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11701 1727096115.76952: in VariableManager get_vars() 11701 1727096115.77083: done with get_vars() 11701 1727096115.77251: in VariableManager get_vars() 11701 1727096115.77264: done with get_vars() 11701 1727096115.77276: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11701 1727096115.77463: in VariableManager get_vars() 11701 1727096115.77638: done with get_vars() 11701 1727096115.78141: in VariableManager get_vars() 11701 1727096115.78155: done with get_vars() 11701 1727096115.78160: variable 'omit' from source: magic vars 11701 1727096115.78296: variable 'omit' from source: magic vars 11701 1727096115.78329: in VariableManager get_vars() 11701 1727096115.78340: done with get_vars() 11701 1727096115.78476: in VariableManager get_vars() 11701 1727096115.78489: done with get_vars() 11701 1727096115.78530: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 11701 1727096115.79044: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 11701 1727096115.79297: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 11701 1727096115.80703: in 
VariableManager get_vars() 11701 1727096115.80728: done with get_vars() 11701 1727096115.81542: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 11701 1727096115.81915: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11701 1727096115.85443: in VariableManager get_vars() 11701 1727096115.85466: done with get_vars() 11701 1727096115.85482: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11701 1727096115.85804: in VariableManager get_vars() 11701 1727096115.85941: done with get_vars() 11701 1727096115.86226: in VariableManager get_vars() 11701 1727096115.86245: done with get_vars() 11701 1727096115.86902: in VariableManager get_vars() 11701 1727096115.86920: done with get_vars() 11701 1727096115.86925: variable 'omit' from source: magic vars 11701 1727096115.87063: variable 'omit' from source: magic vars 11701 1727096115.87103: in VariableManager get_vars() 11701 1727096115.87117: done with get_vars() 11701 1727096115.87137: in VariableManager get_vars() 11701 1727096115.87151: done with get_vars() 11701 1727096115.87404: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 11701 1727096115.87518: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 11701 1727096115.92452: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 11701 1727096115.93294: in VariableManager get_vars() 11701 1727096115.93322: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11701 1727096115.97754: in VariableManager get_vars() 11701 1727096115.97779: done with get_vars() 11701 1727096115.97789: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 11701 1727096115.98841: in VariableManager get_vars() 11701 1727096115.98864: done with get_vars() 11701 1727096115.99145: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 11701 1727096115.99159: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 11701 1727096115.99638: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 11701 1727096115.99830: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 11701 1727096115.99833: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-And/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) 11701 1727096115.99862: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 11701 1727096116.00102: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 11701 1727096116.00367: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 11701 1727096116.00578: Loaded config def from plugin (callback/default) 11701 1727096116.00581: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 11701 1727096116.02995: Loaded config def from plugin (callback/junit) 11701 1727096116.02999: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 11701 1727096116.03099: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 11701 1727096116.03278: Loaded config def from plugin (callback/minimal) 11701 1727096116.03280: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 11701 1727096116.03321: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 11701 1727096116.03495: Loaded config def from plugin (callback/tree) 11701 1727096116.03498: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 11701 1727096116.03735: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 11701 1727096116.03737: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-And/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
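For orientation before the play banners below: the inventory parse near the top of this trace (the yaml inventory plugin reading /tmp/network-EuO/inventory.yml, then setting ansible_host and ansible_ssh_extra_args for managed_node1, managed_node2 and managed_node3) implies an inventory file of roughly the following shape. This is a hedged reconstruction: the host names and variable names come from the log, managed_node3's address is taken from the ssh debug output later in the trace, and the remaining addresses and SSH options are placeholders, since the log never prints their values.

# Hypothetical reconstruction of /tmp/network-EuO/inventory.yml (values are illustrative)
all:
  hosts:
    managed_node1:
      ansible_host: 198.51.100.11                            # placeholder address
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"  # placeholder SSH options
    managed_node2:
      ansible_host: 198.51.100.12                            # placeholder address
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"  # placeholder SSH options
    managed_node3:
      ansible_host: 10.31.14.152                             # address seen in the ssh debug output below
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"  # placeholder SSH options

Leaving the hosts directly under all.hosts is consistent with the "Group ungrouped now contains managed_node..." lines in the trace: hosts that belong to no explicit group are placed in the ungrouped group.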
PLAYBOOK: tests_bond_nm.yml ****************************************************
2 plays in /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml
11701 1727096116.03765: in VariableManager get_vars()
11701 1727096116.03781: done with get_vars()
11701 1727096116.03875: in VariableManager get_vars()
11701 1727096116.03886: done with get_vars()
11701 1727096116.03890: variable 'omit' from source: magic vars
11701 1727096116.03931: in VariableManager get_vars()
11701 1727096116.03944: done with get_vars()
11701 1727096116.03965: variable 'omit' from source: magic vars
PLAY [Run playbook 'playbooks/tests_bond.yml' with nm as provider] *************
11701 1727096116.05320: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
11701 1727096116.05396: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
11701 1727096116.05539: getting the remaining hosts for this loop
11701 1727096116.05541: done getting the remaining hosts for this loop
11701 1727096116.05544: getting the next task for host managed_node3
11701 1727096116.05548: done getting next task for host managed_node3
11701 1727096116.05550: ^ task is: TASK: Gathering Facts
11701 1727096116.05552: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11701 1727096116.05554: getting variables
11701 1727096116.05555: in VariableManager get_vars()
11701 1727096116.05566: Calling all_inventory to load vars for managed_node3
11701 1727096116.05571: Calling groups_inventory to load vars for managed_node3
11701 1727096116.05573: Calling all_plugins_inventory to load vars for managed_node3
11701 1727096116.05585: Calling all_plugins_play to load vars for managed_node3
11701 1727096116.05595: Calling groups_plugins_inventory to load vars for managed_node3
11701 1727096116.05598: Calling groups_plugins_play to load vars for managed_node3
11701 1727096116.05629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11701 1727096116.05684: done with get_vars()
11701 1727096116.05691: done getting variables
11701 1727096116.05752: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)
TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:6
Monday 23 September 2024 08:55:16 -0400 (0:00:00.025) 0:00:00.025 ******
11701 1727096116.06080: entering _queue_task() for managed_node3/gather_facts
11701 1727096116.06081: Creating lock for gather_facts
11701 1727096116.06869: worker is 1 (out of 1 available)
11701 1727096116.06881: exiting _queue_task() for managed_node3/gather_facts
11701 1727096116.06894: done queuing things up, now waiting for results queue to drain
11701 1727096116.06896: waiting for pending results...
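The PLAYBOOK and PLAY banners above describe a two-play wrapper: tests_bond_nm.yml first pins NetworkManager ('nm') as the network provider and then hands off to the shared playbooks/tests_bond.yml that the loader already imported earlier in the trace. As a hedged sketch only (the exact task list in the real file may differ), a wrapper of this kind typically looks like:

# Illustrative sketch in the style of tests_bond_nm.yml; details are assumptions, not the file's verbatim contents
- name: Run playbook 'playbooks/tests_bond.yml' with nm as provider
  hosts: all
  tasks:
    - name: Set network provider to 'nm'
      set_fact:
        network_provider: nm

- name: Import the shared bond test   # hypothetical play name
  import_playbook: playbooks/tests_bond.yml

The implicit fact gathering for the first play is what produces the TASK [Gathering Facts] banner above, reported at tests_bond_nm.yml:6.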
11701 1727096116.07389: running TaskExecutor() for managed_node3/TASK: Gathering Facts 11701 1727096116.07395: in run() - task 0afff68d-5257-a05c-c957-0000000000cc 11701 1727096116.07398: variable 'ansible_search_path' from source: unknown 11701 1727096116.07401: calling self._execute() 11701 1727096116.07618: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096116.08074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096116.08078: variable 'omit' from source: magic vars 11701 1727096116.08080: variable 'omit' from source: magic vars 11701 1727096116.08082: variable 'omit' from source: magic vars 11701 1727096116.08084: variable 'omit' from source: magic vars 11701 1727096116.08125: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096116.08171: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096116.08472: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096116.08476: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096116.08479: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096116.08481: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096116.08483: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096116.08485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096116.08583: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096116.08684: Set connection var ansible_timeout to 10 11701 1727096116.08693: Set connection var ansible_shell_type to sh 11701 1727096116.08703: Set connection var ansible_shell_executable to /bin/sh 11701 1727096116.08710: Set connection var ansible_connection to ssh 11701 1727096116.08724: Set connection var ansible_pipelining to False 11701 1727096116.08754: variable 'ansible_shell_executable' from source: unknown 11701 1727096116.08880: variable 'ansible_connection' from source: unknown 11701 1727096116.09073: variable 'ansible_module_compression' from source: unknown 11701 1727096116.09076: variable 'ansible_shell_type' from source: unknown 11701 1727096116.09079: variable 'ansible_shell_executable' from source: unknown 11701 1727096116.09081: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096116.09083: variable 'ansible_pipelining' from source: unknown 11701 1727096116.09085: variable 'ansible_timeout' from source: unknown 11701 1727096116.09087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096116.09290: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096116.09307: variable 'omit' from source: magic vars 11701 1727096116.09318: starting attempt loop 11701 1727096116.09325: running the handler 11701 1727096116.09345: variable 'ansible_facts' from source: unknown 11701 1727096116.09373: _low_level_execute_command(): starting 11701 1727096116.09385: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096116.10770: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096116.10789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096116.10960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096116.10976: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096116.11032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096116.12718: stdout chunk (state=3): >>>/root <<< 11701 1727096116.12921: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096116.13100: stderr chunk (state=3): >>><<< 11701 1727096116.13105: stdout chunk (state=3): >>><<< 11701 1727096116.13227: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096116.13231: _low_level_execute_command(): starting 11701 1727096116.13233: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096116.131316-11741-122039749881434 `" && echo ansible-tmp-1727096116.131316-11741-122039749881434="` echo /root/.ansible/tmp/ansible-tmp-1727096116.131316-11741-122039749881434 `" ) && sleep 0' 11701 1727096116.14074: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096116.14079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096116.14081: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096116.14304: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096116.16217: stdout chunk (state=3): >>>ansible-tmp-1727096116.131316-11741-122039749881434=/root/.ansible/tmp/ansible-tmp-1727096116.131316-11741-122039749881434 <<< 11701 1727096116.16360: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096116.16378: stdout chunk (state=3): >>><<< 11701 1727096116.16391: stderr chunk (state=3): >>><<< 11701 1727096116.16415: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096116.131316-11741-122039749881434=/root/.ansible/tmp/ansible-tmp-1727096116.131316-11741-122039749881434 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096116.16463: variable 'ansible_module_compression' from source: unknown 11701 1727096116.16527: ANSIBALLZ: Using generic lock for ansible.legacy.setup 11701 1727096116.16541: ANSIBALLZ: Acquiring lock 11701 1727096116.16553: ANSIBALLZ: Lock acquired: 139907404354416 11701 1727096116.16563: ANSIBALLZ: Creating module 11701 1727096116.36161: ANSIBALLZ: Writing module into payload 11701 1727096116.36264: ANSIBALLZ: Writing module 11701 1727096116.36283: ANSIBALLZ: Renaming module 11701 1727096116.36288: ANSIBALLZ: Done creating module 
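At this point the connection plumbing for the Gathering Facts task is in place: the controller has confirmed the remote home directory via 'echo ~', created a remote temp directory under /root/.ansible/tmp, and built the AnsiballZ payload for the setup module. The next exchange below is Ansible's Python interpreter discovery probe, which ends up selecting /usr/bin/python3.12 on the managed node. When the interpreter is already known, that probe can be skipped by pinning it as an inventory or group variable, for example in a hypothetical group_vars/all.yml:

# Pinning the interpreter avoids the 'echo PLATFORM; uname; ...' discovery probe seen below
ansible_python_interpreter: /usr/bin/python3.12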
11701 1727096116.36306: variable 'ansible_facts' from source: unknown 11701 1727096116.36312: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096116.36325: _low_level_execute_command(): starting 11701 1727096116.36329: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 11701 1727096116.36780: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096116.36784: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096116.36794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096116.36811: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 11701 1727096116.36814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096116.36872: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096116.36876: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096116.36878: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096116.36927: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096116.38590: stdout chunk (state=3): >>>PLATFORM <<< 11701 1727096116.38662: stdout chunk (state=3): >>>Linux <<< 11701 1727096116.38678: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 <<< 11701 1727096116.38685: stdout chunk (state=3): >>>/usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 11701 1727096116.38813: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096116.38845: stderr chunk (state=3): >>><<< 11701 1727096116.38851: stdout chunk (state=3): >>><<< 11701 1727096116.38865: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096116.38878 [managed_node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 11701 1727096116.38916: _low_level_execute_command(): starting 11701 1727096116.38922: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 11701 1727096116.39002: Sending initial data 11701 1727096116.39005: Sent initial data (1181 bytes) 11701 1727096116.39376: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096116.39380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 11701 1727096116.39382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096116.39384: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096116.39386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 11701 1727096116.39393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096116.39440: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096116.39443: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096116.39447: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096116.39492: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096116.42934: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 11701 1727096116.43342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096116.43378: stderr chunk (state=3): >>><<< 11701 1727096116.43381: stdout chunk (state=3): >>><<< 11701 1727096116.43393: _low_level_execute_command() done: rc=0, 
stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096116.43449: variable 'ansible_facts' from source: unknown 11701 1727096116.43453: variable 'ansible_facts' from source: unknown 11701 1727096116.43463: variable 'ansible_module_compression' from source: unknown 11701 1727096116.43504: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11701ub3f79xu/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11701 1727096116.43527: variable 'ansible_facts' from source: unknown 11701 1727096116.43623: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096116.131316-11741-122039749881434/AnsiballZ_setup.py 11701 1727096116.43742: Sending initial data 11701 1727096116.43745: Sent initial data (153 bytes) 11701 1727096116.44207: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096116.44211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096116.44213: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096116.44215: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 11701 1727096116.44217: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096116.44219: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096116.44274: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096116.44279: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096116.44298: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096116.44326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096116.45960: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096116.45989: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11701 1727096116.46017: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpf2qc_3hk /root/.ansible/tmp/ansible-tmp-1727096116.131316-11741-122039749881434/AnsiballZ_setup.py <<< 11701 1727096116.46024: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096116.131316-11741-122039749881434/AnsiballZ_setup.py" <<< 11701 1727096116.46047: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpf2qc_3hk" to remote "/root/.ansible/tmp/ansible-tmp-1727096116.131316-11741-122039749881434/AnsiballZ_setup.py" <<< 11701 1727096116.46054: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096116.131316-11741-122039749881434/AnsiballZ_setup.py" <<< 11701 1727096116.47017: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096116.47062: stderr chunk (state=3): >>><<< 11701 1727096116.47065: stdout chunk (state=3): >>><<< 11701 1727096116.47111: done transferring module to remote 11701 1727096116.47123: _low_level_execute_command(): starting 11701 1727096116.47128: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096116.131316-11741-122039749881434/ /root/.ansible/tmp/ansible-tmp-1727096116.131316-11741-122039749881434/AnsiballZ_setup.py && sleep 0' 11701 1727096116.47697: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096116.47712: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096116.47747: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096116.47751: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096116.47753: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096116.47791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096116.49628: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096116.49656: stderr chunk (state=3): >>><<< 11701 1727096116.49659: stdout chunk (state=3): >>><<< 11701 1727096116.49676: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096116.49679: _low_level_execute_command(): starting 11701 1727096116.49684: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096116.131316-11741-122039749881434/AnsiballZ_setup.py && sleep 0' 11701 1727096116.50137: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096116.50140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096116.50143: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 11701 1727096116.50145: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096116.50147: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096116.50192: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096116.50204: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096116.50251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096116.52487: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 11701 1727096116.52503: stdout chunk (state=3): >>>import _imp # builtin <<< 11701 1727096116.52537: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 11701 1727096116.52603: stdout chunk (state=3): >>>import '_io' # <<< 11701 1727096116.52608: stdout chunk (state=3): >>>import 'marshal' # <<< 11701 1727096116.52645: stdout chunk (state=3): >>>import 'posix' # <<< 11701 1727096116.52678: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 11701 1727096116.52703: stdout chunk (state=3): >>>import 'time' # <<< 11701 1727096116.52714: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 11701 1727096116.52766: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 11701 1727096116.52799: stdout chunk (state=3): >>>import '_codecs' # <<< 11701 1727096116.52814: stdout chunk (state=3): >>>import 'codecs' # <<< 11701 1727096116.53038: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 11701 1727096116.53050: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f81b104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f81adfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f81b12a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # <<< 11701 1727096116.53215: stdout chunk (state=3): >>>import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages <<< 11701 1727096116.53218: stdout chunk (state=3): >>>Processing global site-packages <<< 11701 1727096116.53295: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 11701 1727096116.53299: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 11701 
1727096116.53311: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f818c1130> <<< 11701 1727096116.53371: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 11701 1727096116.53407: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f818c1fa0> <<< 11701 1727096116.53411: stdout chunk (state=3): >>>import 'site' # <<< 11701 1727096116.53436: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 11701 1727096116.53808: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 11701 1727096116.53847: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 11701 1727096116.53852: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 11701 1727096116.53871: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 11701 1727096116.53897: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 11701 1727096116.53927: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 11701 1727096116.53945: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 11701 1727096116.53976: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 11701 1727096116.54010: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f818ffdd0> <<< 11701 1727096116.54038: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 11701 1727096116.54066: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f818fffe0> <<< 11701 1727096116.54085: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 11701 1727096116.54108: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 11701 1727096116.54135: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 11701 1727096116.54186: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 11701 1727096116.54205: stdout chunk (state=3): >>>import 'itertools' # <<< 11701 1727096116.54252: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' 
import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f81937800> <<< 11701 1727096116.54278: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 11701 1727096116.54293: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f81937e90> import '_collections' # <<< 11701 1727096116.54343: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f81917aa0> <<< 11701 1727096116.54362: stdout chunk (state=3): >>>import '_functools' # <<< 11701 1727096116.54386: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f819151c0> <<< 11701 1727096116.54645: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f818fcf80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 11701 1727096116.54649: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 11701 1727096116.54689: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f819576e0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f81956300> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f81916060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f818fee70> <<< 11701 1727096116.54906: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8198c7a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f818fc200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f8198cc50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8198cb00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f8198cef0> <<< 11701 1727096116.54914: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f818fad20> <<< 11701 1727096116.54955: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 11701 1727096116.54977: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 11701 1727096116.55006: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 11701 1727096116.55010: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8198d5b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8198d280> <<< 11701 1727096116.55250: stdout chunk (state=3): >>>import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8198e4b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 11701 1727096116.55265: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 11701 1727096116.55272: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f819a4680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f819a5d30> <<< 11701 1727096116.55453: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 11701 1727096116.55458: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f819a6bd0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f819a7230> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f819a6120> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 11701 1727096116.55461: stdout chunk (state=3): >>># extension module '_lzma' 
loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f819a7cb0> <<< 11701 1727096116.55536: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f819a73e0> <<< 11701 1727096116.55539: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8198e450> <<< 11701 1727096116.55541: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 11701 1727096116.55565: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 11701 1727096116.55787: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 11701 1727096116.55790: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f816afb90> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 11701 1727096116.55904: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f816d8650> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f816d83b0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f816d8680> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 11701 1727096116.56022: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f816d8fb0> <<< 11701 1727096116.56189: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f816d9910> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f816d8860> import 'random' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6f816add60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 11701 1727096116.56299: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f816dacc0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f816d97f0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8198eba0> <<< 11701 1727096116.56516: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f81707020> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 11701 1727096116.56629: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8172b410> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 11701 1727096116.56760: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # <<< 11701 1727096116.56764: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' <<< 11701 1727096116.56766: stdout chunk (state=3): >>>import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f817881a0> <<< 11701 1727096116.56837: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 11701 1727096116.56841: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 11701 1727096116.56843: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 11701 1727096116.56876: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 11701 1727096116.56961: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8178a900> <<< 11701 1727096116.57043: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f817882c0> <<< 11701 1727096116.57079: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7f6f817551c0> <<< 11701 1727096116.57111: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f815992e0> <<< 11701 1727096116.57228: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8172a210> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f816dbbf0> <<< 11701 1727096116.57349: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f6f8172a570> <<< 11701 1727096116.57603: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_h1fcgspl/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 11701 1727096116.57722: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.57776: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 11701 1727096116.57983: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 11701 1727096116.58005: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f815fb020> import '_typing' # <<< 11701 1727096116.58114: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f815d9f10> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f815d90a0> <<< 11701 1727096116.58134: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.58163: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 11701 1727096116.58192: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.58233: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 11701 1727096116.58236: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.59678: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.61223: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f815f9310> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # 
/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f8162a9c0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8162a750> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8162a060> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8162a7b0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f81b129c0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f8162b740> <<< 11701 1727096116.61246: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f8162b980> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 11701 1727096116.61303: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 11701 1727096116.61318: stdout chunk (state=3): >>>import '_locale' # <<< 11701 1727096116.61356: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8162bec0> <<< 11701 1727096116.61376: stdout chunk (state=3): >>>import 'pwd' # <<< 11701 1727096116.61529: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 11701 1727096116.61533: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f29cd0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80f2b8f0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 11701 1727096116.61574: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f2c230> <<< 11701 1727096116.61592: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 11701 1727096116.61626: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 11701 1727096116.61641: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f2d3d0> <<< 11701 1727096116.61661: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 11701 1727096116.61702: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 11701 1727096116.61731: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 11701 1727096116.61784: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f2fe00> <<< 11701 1727096116.61823: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f81757dd0> <<< 11701 1727096116.61944: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f2e0c0> <<< 11701 1727096116.61949: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 11701 1727096116.61974: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 11701 1727096116.62114: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 11701 1727096116.62117: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 11701 1727096116.62120: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f37e30> <<< 11701 1727096116.62134: stdout chunk (state=3): >>>import '_tokenize' # <<< 11701 1727096116.62197: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f36900> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f36660> <<< 11701 1727096116.62233: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 11701 1727096116.62236: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 11701 1727096116.62422: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f36bd0> <<< 11701 1727096116.62430: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f2e540> # extension module 'syslog' loaded from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80f7bef0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f7c1a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 11701 1727096116.62466: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 11701 1727096116.62472: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 11701 1727096116.62511: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 11701 1727096116.62531: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80f7dc40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f7da00> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 11701 1727096116.62629: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80f801d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f7e300> <<< 11701 1727096116.62652: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 11701 1727096116.62695: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 11701 1727096116.62843: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 11701 1727096116.62856: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f83980> <<< 11701 1727096116.62918: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f80380> <<< 11701 1727096116.62983: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80f84740> <<< 11701 1727096116.63014: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80f84aa0> <<< 11701 1727096116.63064: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80f84b30> <<< 11701 1727096116.63097: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f7c2c0> <<< 11701 1727096116.63108: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 11701 1727096116.63159: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 11701 1727096116.63184: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 11701 1727096116.63216: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80e103e0> <<< 11701 1727096116.63454: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80e11250> <<< 11701 1727096116.63458: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f86b70> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80f87ef0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f86780> <<< 11701 1727096116.63683: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.63686: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 11701 1727096116.63713: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 11701 
1727096116.63732: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.63851: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.63974: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.64530: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.65097: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 11701 1727096116.65120: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 11701 1727096116.65144: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 11701 1727096116.65160: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 11701 1727096116.65211: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80e155b0> <<< 11701 1727096116.65294: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 11701 1727096116.65375: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80e169c0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80e113a0> import 'ansible.module_utils.compat.selinux' # <<< 11701 1727096116.65399: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.65431: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 11701 1727096116.65448: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.65588: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.65756: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 11701 1727096116.65778: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80e16b10> # zipimport: zlib available <<< 11701 1727096116.66257: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.66716: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.66786: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.66871: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 11701 1727096116.66966: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.66991: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 11701 1727096116.67030: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.67142: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 11701 1727096116.67147: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.67171: stdout chunk (state=3): >>># 
zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 11701 1727096116.67209: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.67316: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 11701 1727096116.67325: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.67508: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.67750: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 11701 1727096116.67992: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80e176e0> # zipimport: zlib available # zipimport: zlib available <<< 11701 1727096116.68069: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 11701 1727096116.68094: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 11701 1727096116.68112: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.68152: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.68186: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 11701 1727096116.68203: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.68241: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.68285: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.68341: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.68605: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80e21f40> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80e1fd70> <<< 11701 1727096116.68632: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 11701 1727096116.68691: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.68756: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.68781: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.68823: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 11701 1727096116.68844: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 11701 1727096116.68870: stdout chunk (state=3): >>># code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 11701 1727096116.68889: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 11701 1727096116.68955: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 11701 1727096116.68983: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 11701 1727096116.68995: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 11701 1727096116.69197: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f0a810> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8164a4e0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80e21d90> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80e16f90> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 11701 1727096116.69215: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.69241: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 11701 1727096116.69296: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 11701 1727096116.69311: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.69497: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 11701 1727096116.69541: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.69578: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.69613: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.69647: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 11701 1727096116.69669: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.69734: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.69982: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.70003: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 11701 1727096116.70049: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.70218: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.70258: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.70315: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 11701 1727096116.70342: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 11701 1727096116.70366: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 11701 1727096116.70380: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 11701 1727096116.70423: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80eb5fa0> <<< 11701 1727096116.70460: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 11701 1727096116.70525: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 11701 1727096116.70553: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 11701 1727096116.70574: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80adfef0> <<< 11701 1727096116.70620: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 11701 1727096116.70623: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80ae4290> <<< 11701 1727096116.70685: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80e9eab0> <<< 11701 1727096116.70689: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80eb6ab0> <<< 11701 1727096116.70725: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80eb4620> <<< 11701 1727096116.70780: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80eb42c0> <<< 11701 1727096116.70783: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 11701 1727096116.70817: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 11701 1727096116.70833: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 11701 1727096116.71119: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 11701 1727096116.71123: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80ae71d0> import 
'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80ae6ab0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80ae6c60> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80ae5ee0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80ae7260> <<< 11701 1727096116.71140: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 11701 1727096116.71166: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 11701 1727096116.71196: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80b4dd30> <<< 11701 1727096116.71223: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80ae7d10> <<< 11701 1727096116.71390: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80eb42f0> import 'ansible.module_utils.facts.timeout' # <<< 11701 1727096116.71394: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 11701 1727096116.71422: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available <<< 11701 1727096116.71444: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 11701 1727096116.71495: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.71549: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 11701 1727096116.71578: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 11701 1727096116.71590: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.71617: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.71646: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 11701 1727096116.71666: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.71702: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.71758: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 11701 1727096116.71771: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.71805: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.71849: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.system.chroot' # <<< 11701 1727096116.71862: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.71991: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11701 1727096116.72025: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.72082: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 11701 1727096116.72101: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 11701 1727096116.72687: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.73005: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 11701 1727096116.73022: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.73064: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.73116: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.73400: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available <<< 11701 1727096116.73416: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.73445: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 11701 1727096116.73460: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.73484: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.73512: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 11701 1727096116.73530: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.73602: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.73691: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 11701 1727096116.73719: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80b4f8c0> <<< 11701 1727096116.73737: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 11701 1727096116.73777: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 11701 1727096116.73896: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80b4e7b0> import 'ansible.module_utils.facts.system.local' # <<< 11701 1727096116.73909: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.73973: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.74036: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 11701 1727096116.74056: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.74134: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.74222: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 11701 1727096116.74239: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 11701 1727096116.74294: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.74495: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 11701 1727096116.74520: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 11701 1727096116.74592: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 11701 1727096116.74651: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 11701 1727096116.74673: stdout chunk (state=3): >>>import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80b81e80> <<< 11701 1727096116.74851: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80b4f4d0> import 'ansible.module_utils.facts.system.python' # <<< 11701 1727096116.74870: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.74926: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.74974: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 11701 1727096116.74994: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.75076: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.75159: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.75267: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.75418: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 11701 1727096116.75434: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.75464: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.75612: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 11701 1727096116.75615: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 11701 1727096116.75637: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 11701 1727096116.75682: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80b99970> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80b995b0> import 'ansible.module_utils.facts.system.user' # <<< 11701 1727096116.75704: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 11701 1727096116.75749: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.75785: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 11701 1727096116.75804: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.75973: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.76210: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 11701 1727096116.76218: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.76309: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.76351: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.76396: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 11701 1727096116.76405: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 11701 1727096116.76429: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.76452: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.76589: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.76781: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 11701 1727096116.76797: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 11701 1727096116.76864: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.77019: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 11701 1727096116.77027: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.77046: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.77060: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.78092: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.78338: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 11701 1727096116.78342: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.78498: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 11701 1727096116.78542: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.78603: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 11701 1727096116.78621: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.79100: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 11701 1727096116.79149: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.79253: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.79588: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.79672: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 11701 1727096116.79691: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.79712: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.79755: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 11701 1727096116.79823: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11701 
1727096116.79845: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 11701 1727096116.79896: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.80006: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available <<< 11701 1727096116.80235: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available <<< 11701 1727096116.80260: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 11701 1727096116.80531: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.80922: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available <<< 11701 1727096116.80939: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 11701 1727096116.81003: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # <<< 11701 1727096116.81030: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.81326: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 11701 1727096116.81329: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # <<< 11701 1727096116.81344: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 11701 1727096116.81365: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.81391: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.81440: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 11701 1727096116.81470: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.81487: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.81536: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.81583: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.81650: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.81716: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 11701 1727096116.81739: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 11701 1727096116.81787: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.81836: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 11701 1727096116.81982: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.82050: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.82247: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 11701 1727096116.82264: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.82299: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 11701 1727096116.82346: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 11701 1727096116.82363: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.82401: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.82447: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 11701 1727096116.82480: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.82536: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.82812: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available <<< 11701 1727096116.82820: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 11701 1727096116.82943: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096116.83079: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 11701 1727096116.83123: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 11701 1727096116.83153: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f8092e330> <<< 11701 1727096116.83175: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8092cda0> <<< 11701 1727096116.83215: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80927d40> <<< 11701 1727096116.95676: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 11701 1727096116.96000: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80976c30> <<< 11701 1727096116.96016: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80975040> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object 
from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80977380> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80976240> <<< 11701 1727096116.96191: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 11701 1727096117.20247: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-152", "ansible_nodename": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24182c1ede649ec25b32fe78ed72bb", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7DqKfC0Ye8uNTSswfdzhsJfWNkZqr/RUstyC1z93+5saW22gxwNSvixV/q/ri8GxqpAsSE+oK2wsR0V0GdogZH271MQEZ63vu1adXuVvwMLN81oGLTzCz50BgmDXlvpEVTodRA7qlbviA7mmwUA/1WP1v1UqCiHmBjaiQT14v9PJhSSiWk1S/2gPlOmfl801arqI6YnvGYHiIop+XESUB4W8ewtJtTJhFqzucGnDCUYA1VAqcSIdjoQaZhZeDXc1gRTzx7QIe3EGJnCnomIPoXNvQk2UgnH08TJGZFoICgqdfZUi+g2uK+PweRJ1TUsmlf86iGOSFYgnCserDeAdpOIuLR5oBE8UNKjxPBW6hJ3It+vPT8mrYLOXrlbt7it9HhxfgYWcqc6ebJyBYNMo0bPdddEpIPiWDXZOoQmZGaGOpSa1K+8hxIvZ9t+jl8b8b8sODYeoVjVaeVXt0i5hoW0CoLWO77c+cGgwJ5+kzXD7eo/SVn+kYjDfKFBCtkJU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ80c6wNB873zUjbHI8VtU557DuKd4APS4WjMTqAOLKKumkxtoQY9gCkk5ZG2HLqdKHBgsFER7nThIQ+11R1mBw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINeHLdc2PHGxhVM3VxIFOOPlFPTEnJXEcPAkPavmyu6v", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 44844 10.31.14.152 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 44844 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", 
"month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "55", "second": "16", "epoch": "1727096116", "epoch_int": "1727096116", "date": "2024-09-23", "time": "08:55:16", "iso8601_micro": "2024-09-23T12:55:16.833294Z", "iso8601": "2024-09-23T12:55:16Z", "iso8601_basic": "20240923T085516833294", "iso8601_basic_short": "20240923T085516", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "r<<< 11701 1727096117.20289: stdout chunk (state=3): >>>o": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.6357421875, "5m": 0.51611328125, "15m": 0.23828125}, "ansible_local": {}, "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2995, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 536, "free": 2995}, "nocache": {"free": 3309, "used": 222}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_uuid": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_version": "4.11.amazon", 
"ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 259, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261806272512, "block_size": 4096, "block_total": 65519099, "block_available": 63917547, "block_used": 1601552, "inode_total": 131070960, "inode_available": 131029186, "inode_used": 41774, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", 
"tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:dfff:fe7b:8e75", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, 
"ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.152"], "ansible_all_ipv6_addresses": ["fe80::8ff:dfff:fe7b:8e75"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.152", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:dfff:fe7b:8e75"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11701 1727096117.20912: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 11701 1727096117.21221: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc <<< 11701 1727096117.21225: stdout chunk (state=3): >>># cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg <<< 11701 1727096117.21243: stdout chunk (state=3): >>># cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] 
removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # 
cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing 
ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] 
removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy 
ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 11701 1727096117.21571: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 11701 1727096117.21600: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma <<< 11701 1727096117.21626: stdout chunk (state=3): >>># destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 11701 1727096117.21647: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 11701 1727096117.21673: stdout chunk (state=3): >>># destroy ntpath <<< 11701 1727096117.21709: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 11701 1727096117.21818: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select <<< 11701 1727096117.21821: stdout chunk (state=3): >>># destroy _signal # destroy _posixsubprocess <<< 11701 1727096117.21840: stdout chunk (state=3): >>># destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 11701 1727096117.21916: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle <<< 11701 1727096117.21978: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl <<< 11701 1727096117.22009: stdout chunk (state=3): >>># destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 11701 1727096117.22155: stdout chunk (state=3): >>># destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout <<< 11701 1727096117.22161: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 11701 1727096117.22197: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping 
_ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing<<< 11701 1727096117.22364: stdout chunk (state=3): >>> # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg <<< 11701 1727096117.22368: stdout chunk (state=3): >>># cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 11701 1727096117.22382: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 11701 1727096117.22509: stdout chunk (state=3): >>># destroy sys.monitoring <<< 11701 1727096117.22530: stdout chunk (state=3): >>># destroy _socket # destroy _collections <<< 11701 1727096117.22570: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 11701 1727096117.22633: stdout chunk (state=3): >>># destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing <<< 11701 1727096117.22688: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # 
destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 11701 1727096117.22823: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs <<< 11701 1727096117.22872: stdout chunk (state=3): >>># destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 11701 1727096117.23070: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 11701 1727096117.23075: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools <<< 11701 1727096117.23097: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 11701 1727096117.23271: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096117.23283: stderr chunk (state=3): >>>Shared connection to 10.31.14.152 closed. <<< 11701 1727096117.23340: stderr chunk (state=3): >>><<< 11701 1727096117.23351: stdout chunk (state=3): >>><<< 11701 1727096117.23644: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f81b104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f81adfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f81b12a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f818c1130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f818c1fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f818ffdd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f818fffe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f81937800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f81937e90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f81917aa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f819151c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f818fcf80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f819576e0> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6f81956300> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f81916060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f818fee70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8198c7a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f818fc200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f8198cc50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8198cb00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f8198cef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f818fad20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8198d5b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8198d280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8198e4b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f819a4680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f819a5d30> 
# /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f819a6bd0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f819a7230> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f819a6120> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f819a7cb0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f819a73e0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8198e450> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f816afb90> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f816d8650> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f816d83b0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f816d8680> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import 
'_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f816d8fb0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f816d9910> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f816d8860> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f816add60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f816dacc0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f816d97f0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8198eba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f81707020> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8172b410> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f817881a0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8178a900> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f817882c0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f817551c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from 
'/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f815992e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8172a210> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f816dbbf0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f6f8172a570> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_h1fcgspl/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f815fb020> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f815d9f10> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f815d90a0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f815f9310> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f8162a9c0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8162a750> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8162a060> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8162a7b0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f81b129c0> import 'atexit' # 
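The "zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_.../ansible_ansible.legacy.setup_payload.zip'" entry above is the point where the target-side interpreter starts importing the module's bundled code directly out of the payload zip rather than from files on disk. A minimal sketch of that mechanism, using an illustrative archive name (payload.zip) and package (demo_pkg) rather than Ansible's actual wrapper:

import importlib
import sys
import zipfile

# Build a tiny archive containing one package with one module.
with zipfile.ZipFile("payload.zip", "w") as zf:
    zf.writestr("demo_pkg/__init__.py", "")
    zf.writestr("demo_pkg/hello.py", "def greet():\n    return 'hello from the zip'\n")

# Putting the archive on sys.path lets zipimport resolve the package,
# which is what produces the "zipimport:" lines in the trace above.
sys.path.insert(0, "payload.zip")
hello = importlib.import_module("demo_pkg.hello")
print(hello.greet())
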
# extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f8162b740> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f8162b980> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8162bec0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f29cd0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80f2b8f0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f2c230> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f2d3d0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f2fe00> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f81757dd0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f2e0c0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f37e30> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f36900> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f36660> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f36bd0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f2e540> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80f7bef0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f7c1a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80f7dc40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f7da00> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80f801d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f7e300> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f83980> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f80380> # extension module 'systemd._journal' 
loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80f84740> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80f84aa0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80f84b30> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f7c2c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80e103e0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80e11250> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f86b70> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80f87ef0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f86780> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80e155b0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80e169c0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80e113a0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80e16b10> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80e176e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80e21f40> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80e1fd70> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80f0a810> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8164a4e0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80e21d90> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80e16f90> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80eb5fa0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80adfef0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80ae4290> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80e9eab0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80eb6ab0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80eb4620> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80eb42c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80ae71d0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80ae6ab0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80ae6c60> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80ae5ee0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80ae7260> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80b4dd30> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80ae7d10> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80eb42f0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80b4f8c0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80b4e7b0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80b81e80> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80b4f4d0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 
'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f80b99970> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80b995b0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f8092e330> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f8092cda0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80927d40> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80976c30> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80975040> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80977380> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f80976240> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-152", "ansible_nodename": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24182c1ede649ec25b32fe78ed72bb", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7DqKfC0Ye8uNTSswfdzhsJfWNkZqr/RUstyC1z93+5saW22gxwNSvixV/q/ri8GxqpAsSE+oK2wsR0V0GdogZH271MQEZ63vu1adXuVvwMLN81oGLTzCz50BgmDXlvpEVTodRA7qlbviA7mmwUA/1WP1v1UqCiHmBjaiQT14v9PJhSSiWk1S/2gPlOmfl801arqI6YnvGYHiIop+XESUB4W8ewtJtTJhFqzucGnDCUYA1VAqcSIdjoQaZhZeDXc1gRTzx7QIe3EGJnCnomIPoXNvQk2UgnH08TJGZFoICgqdfZUi+g2uK+PweRJ1TUsmlf86iGOSFYgnCserDeAdpOIuLR5oBE8UNKjxPBW6hJ3It+vPT8mrYLOXrlbt7it9HhxfgYWcqc6ebJyBYNMo0bPdddEpIPiWDXZOoQmZGaGOpSa1K+8hxIvZ9t+jl8b8b8sODYeoVjVaeVXt0i5hoW0CoLWO77c+cGgwJ5+kzXD7eo/SVn+kYjDfKFBCtkJU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ80c6wNB873zUjbHI8VtU557DuKd4APS4WjMTqAOLKKumkxtoQY9gCkk5ZG2HLqdKHBgsFER7nThIQ+11R1mBw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINeHLdc2PHGxhVM3VxIFOOPlFPTEnJXEcPAkPavmyu6v", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 44844 10.31.14.152 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 44844 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "55", "second": "16", "epoch": "1727096116", "epoch_int": "1727096116", "date": "2024-09-23", "time": "08:55:16", "iso8601_micro": "2024-09-23T12:55:16.833294Z", "iso8601": "2024-09-23T12:55:16Z", "iso8601_basic": "20240923T085516833294", "iso8601_basic_short": "20240923T085516", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.6357421875, "5m": 0.51611328125, "15m": 0.23828125}, "ansible_local": {}, "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2995, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 536, "free": 2995}, "nocache": {"free": 3309, "used": 222}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", 
"ansible_product_name": "HVM domU", "ansible_product_serial": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_uuid": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 259, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261806272512, "block_size": 4096, "block_total": 65519099, "block_available": 63917547, "block_used": 1601552, "inode_total": 131070960, "inode_available": 131029186, "inode_used": 41774, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off 
[fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:dfff:fe7b:8e75", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", 
"hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.152"], "ansible_all_ipv6_addresses": ["fe80::8ff:dfff:fe7b:8e75"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.152", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:dfff:fe7b:8e75"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # 
cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing 
ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy 
ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing 
ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # 
destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # 
cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. [WARNING]: Module invocation had junk after the JSON data: (interpreter shutdown output omitted; it is identical to the "# clear ... # cleanup ... # destroy ..." dump reproduced above) [WARNING]: Platform linux on host managed_node3 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 
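The interpreter-discovery warning above is informational and can be silenced by pinning the interpreter on the host instead of relying on discovery. A minimal sketch, assuming an inventory entry for managed_node3 (the host name, address, and file layout are illustrative; ansible_python_interpreter is the standard variable the warning's documentation link refers to):

    # inventory.yml (illustrative sketch, not the actual test inventory)
    all:
      hosts:
        managed_node3:
          ansible_host: 10.31.14.152
          # pin the interpreter explicitly so a future Python installation
          # cannot change which interpreter Ansible selects on this host
          ansible_python_interpreter: /usr/bin/python3.12

With the variable set, fact gathering uses the pinned interpreter directly and the discovery warning should no longer be emitted for this host.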
11701 1727096117.27385: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096116.131316-11741-122039749881434/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096117.27389: _low_level_execute_command(): starting 11701 1727096117.27392: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096116.131316-11741-122039749881434/ > /dev/null 2>&1 && sleep 0' 11701 1727096117.28587: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096117.28788: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096117.28944: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096117.28978: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096117.30898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096117.30903: stdout chunk (state=3): >>><<< 11701 1727096117.30906: stderr chunk (state=3): >>><<< 11701 1727096117.30924: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096117.30975: handler run complete 11701 1727096117.31273: variable 'ansible_facts' from source: unknown 11701 1727096117.31573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096117.32153: variable 'ansible_facts' from source: unknown 11701 1727096117.32197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096117.32490: attempt loop complete, returning result 11701 1727096117.32499: _execute() done 11701 1727096117.32504: dumping result to json 11701 1727096117.32530: done dumping result, returning 11701 1727096117.32591: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0afff68d-5257-a05c-c957-0000000000cc] 11701 1727096117.32599: sending task result for task 0afff68d-5257-a05c-c957-0000000000cc 11701 1727096117.33791: done sending task result for task 0afff68d-5257-a05c-c957-0000000000cc 11701 1727096117.33795: WORKER PROCESS EXITING ok: [managed_node3] 11701 1727096117.34519: no more pending results, returning what we have 11701 1727096117.34523: results queue empty 11701 1727096117.34524: checking for any_errors_fatal 11701 1727096117.34525: done checking for any_errors_fatal 11701 1727096117.34526: checking for max_fail_percentage 11701 1727096117.34527: done checking for max_fail_percentage 11701 1727096117.34528: checking to see if all hosts have failed and the running result is not ok 11701 1727096117.34528: done checking to see if all hosts have failed 11701 1727096117.34529: getting the remaining hosts for this loop 11701 1727096117.34531: done getting the remaining hosts for this loop 11701 1727096117.34534: getting the next task for host managed_node3 11701 1727096117.34540: done getting next task for host managed_node3 11701 1727096117.34542: ^ task is: TASK: meta (flush_handlers) 11701 1727096117.34544: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096117.34547: getting variables 11701 1727096117.34548: in VariableManager get_vars() 11701 1727096117.34574: Calling all_inventory to load vars for managed_node3 11701 1727096117.34577: Calling groups_inventory to load vars for managed_node3 11701 1727096117.34580: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096117.34591: Calling all_plugins_play to load vars for managed_node3 11701 1727096117.34593: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096117.34596: Calling groups_plugins_play to load vars for managed_node3 11701 1727096117.34763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096117.35163: done with get_vars() 11701 1727096117.35176: done getting variables 11701 1727096117.35240: in VariableManager get_vars() 11701 1727096117.35249: Calling all_inventory to load vars for managed_node3 11701 1727096117.35251: Calling groups_inventory to load vars for managed_node3 11701 1727096117.35254: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096117.35258: Calling all_plugins_play to load vars for managed_node3 11701 1727096117.35261: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096117.35264: Calling groups_plugins_play to load vars for managed_node3 11701 1727096117.35603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096117.36002: done with get_vars() 11701 1727096117.36016: done queuing things up, now waiting for results queue to drain 11701 1727096117.36018: results queue empty 11701 1727096117.36019: checking for any_errors_fatal 11701 1727096117.36022: done checking for any_errors_fatal 11701 1727096117.36023: checking for max_fail_percentage 11701 1727096117.36024: done checking for max_fail_percentage 11701 1727096117.36024: checking to see if all hosts have failed and the running result is not ok 11701 1727096117.36025: done checking to see if all hosts have failed 11701 1727096117.36031: getting the remaining hosts for this loop 11701 1727096117.36032: done getting the remaining hosts for this loop 11701 1727096117.36035: getting the next task for host managed_node3 11701 1727096117.36040: done getting next task for host managed_node3 11701 1727096117.36042: ^ task is: TASK: Include the task 'el_repo_setup.yml' 11701 1727096117.36044: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096117.36046: getting variables 11701 1727096117.36047: in VariableManager get_vars() 11701 1727096117.36055: Calling all_inventory to load vars for managed_node3 11701 1727096117.36057: Calling groups_inventory to load vars for managed_node3 11701 1727096117.36059: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096117.36064: Calling all_plugins_play to load vars for managed_node3 11701 1727096117.36066: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096117.36474: Calling groups_plugins_play to load vars for managed_node3 11701 1727096117.36607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096117.36785: done with get_vars() 11701 1727096117.36794: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:11 Monday 23 September 2024 08:55:17 -0400 (0:00:01.307) 0:00:01.337 ****** 11701 1727096117.37279: entering _queue_task() for managed_node3/include_tasks 11701 1727096117.37281: Creating lock for include_tasks 11701 1727096117.37813: worker is 1 (out of 1 available) 11701 1727096117.37827: exiting _queue_task() for managed_node3/include_tasks 11701 1727096117.37838: done queuing things up, now waiting for results queue to drain 11701 1727096117.37839: waiting for pending results... 11701 1727096117.38287: running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' 11701 1727096117.38495: in run() - task 0afff68d-5257-a05c-c957-000000000006 11701 1727096117.38517: variable 'ansible_search_path' from source: unknown 11701 1727096117.38558: calling self._execute() 11701 1727096117.38717: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096117.38729: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096117.38742: variable 'omit' from source: magic vars 11701 1727096117.38961: _execute() done 11701 1727096117.38974: dumping result to json 11701 1727096117.39042: done dumping result, returning 11701 1727096117.39055: done running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' [0afff68d-5257-a05c-c957-000000000006] 11701 1727096117.39067: sending task result for task 0afff68d-5257-a05c-c957-000000000006 11701 1727096117.39373: done sending task result for task 0afff68d-5257-a05c-c957-000000000006 11701 1727096117.39377: WORKER PROCESS EXITING 11701 1727096117.39416: no more pending results, returning what we have 11701 1727096117.39421: in VariableManager get_vars() 11701 1727096117.39453: Calling all_inventory to load vars for managed_node3 11701 1727096117.39456: Calling groups_inventory to load vars for managed_node3 11701 1727096117.39460: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096117.39473: Calling all_plugins_play to load vars for managed_node3 11701 1727096117.39476: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096117.39480: Calling groups_plugins_play to load vars for managed_node3 11701 1727096117.39890: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096117.40373: done with get_vars() 11701 1727096117.40382: variable 'ansible_search_path' from source: unknown 11701 1727096117.40398: we have included files to process 11701 1727096117.40400: generating 
all_blocks data 11701 1727096117.40401: done generating all_blocks data 11701 1727096117.40402: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11701 1727096117.40403: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11701 1727096117.40406: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11701 1727096117.42486: in VariableManager get_vars() 11701 1727096117.42505: done with get_vars() 11701 1727096117.42517: done processing included file 11701 1727096117.42519: iterating over new_blocks loaded from include file 11701 1727096117.42521: in VariableManager get_vars() 11701 1727096117.42531: done with get_vars() 11701 1727096117.42532: filtering new block on tags 11701 1727096117.42547: done filtering new block on tags 11701 1727096117.42550: in VariableManager get_vars() 11701 1727096117.42561: done with get_vars() 11701 1727096117.42562: filtering new block on tags 11701 1727096117.42579: done filtering new block on tags 11701 1727096117.42582: in VariableManager get_vars() 11701 1727096117.42592: done with get_vars() 11701 1727096117.42593: filtering new block on tags 11701 1727096117.42606: done filtering new block on tags 11701 1727096117.42608: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node3 11701 1727096117.42614: extending task lists for all hosts with included blocks 11701 1727096117.42662: done extending task lists 11701 1727096117.42664: done processing included files 11701 1727096117.42665: results queue empty 11701 1727096117.42665: checking for any_errors_fatal 11701 1727096117.43069: done checking for any_errors_fatal 11701 1727096117.43072: checking for max_fail_percentage 11701 1727096117.43073: done checking for max_fail_percentage 11701 1727096117.43074: checking to see if all hosts have failed and the running result is not ok 11701 1727096117.43075: done checking to see if all hosts have failed 11701 1727096117.43076: getting the remaining hosts for this loop 11701 1727096117.43077: done getting the remaining hosts for this loop 11701 1727096117.43080: getting the next task for host managed_node3 11701 1727096117.43084: done getting next task for host managed_node3 11701 1727096117.43087: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 11701 1727096117.43089: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096117.43091: getting variables 11701 1727096117.43093: in VariableManager get_vars() 11701 1727096117.43102: Calling all_inventory to load vars for managed_node3 11701 1727096117.43105: Calling groups_inventory to load vars for managed_node3 11701 1727096117.43108: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096117.43114: Calling all_plugins_play to load vars for managed_node3 11701 1727096117.43116: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096117.43119: Calling groups_plugins_play to load vars for managed_node3 11701 1727096117.43692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096117.44277: done with get_vars() 11701 1727096117.44288: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Monday 23 September 2024 08:55:17 -0400 (0:00:00.070) 0:00:01.408 ****** 11701 1727096117.44357: entering _queue_task() for managed_node3/setup 11701 1727096117.45506: worker is 1 (out of 1 available) 11701 1727096117.45514: exiting _queue_task() for managed_node3/setup 11701 1727096117.45523: done queuing things up, now waiting for results queue to drain 11701 1727096117.45524: waiting for pending results... 11701 1727096117.45685: running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 11701 1727096117.45874: in run() - task 0afff68d-5257-a05c-c957-0000000000dd 11701 1727096117.45878: variable 'ansible_search_path' from source: unknown 11701 1727096117.45880: variable 'ansible_search_path' from source: unknown 11701 1727096117.45908: calling self._execute() 11701 1727096117.45990: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096117.46083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096117.46099: variable 'omit' from source: magic vars 11701 1727096117.47201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11701 1727096117.51689: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11701 1727096117.51797: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11701 1727096117.51841: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11701 1727096117.52012: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11701 1727096117.52042: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11701 1727096117.52240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096117.52280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096117.52502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 11701 1727096117.52612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096117.52616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096117.52814: variable 'ansible_facts' from source: unknown 11701 1727096117.53010: variable 'network_test_required_facts' from source: task vars 11701 1727096117.53179: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 11701 1727096117.53191: variable 'omit' from source: magic vars 11701 1727096117.53232: variable 'omit' from source: magic vars 11701 1727096117.53473: variable 'omit' from source: magic vars 11701 1727096117.53478: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096117.53481: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096117.53483: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096117.53708: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096117.53712: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096117.53715: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096117.53718: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096117.53721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096117.53932: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096117.53974: Set connection var ansible_timeout to 10 11701 1727096117.53977: Set connection var ansible_shell_type to sh 11701 1727096117.53979: Set connection var ansible_shell_executable to /bin/sh 11701 1727096117.53982: Set connection var ansible_connection to ssh 11701 1727096117.53983: Set connection var ansible_pipelining to False 11701 1727096117.54006: variable 'ansible_shell_executable' from source: unknown 11701 1727096117.54014: variable 'ansible_connection' from source: unknown 11701 1727096117.54021: variable 'ansible_module_compression' from source: unknown 11701 1727096117.54030: variable 'ansible_shell_type' from source: unknown 11701 1727096117.54258: variable 'ansible_shell_executable' from source: unknown 11701 1727096117.54261: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096117.54264: variable 'ansible_pipelining' from source: unknown 11701 1727096117.54266: variable 'ansible_timeout' from source: unknown 11701 1727096117.54270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096117.54328: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11701 1727096117.54380: variable 'omit' from source: magic vars 11701 1727096117.54572: starting attempt loop 11701 
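
The trace above gives away most of the shape of the task at el_repo_setup.yml:3: it is a setup (fact-gathering) call, it is guarded by the expression quoted in the "Evaluated conditional" line, and network_test_required_facts is supplied as a task var. A sketch consistent with that, in which the gather_subset value and the example fact names are assumptions rather than anything the log shows:

    # tasks/el_repo_setup.yml, first task (sketch)
    - name: Gather the minimum subset of ansible_facts required by the network role test
      setup:
        gather_subset: min            # assumed; the subset itself is not logged
      vars:
        network_test_required_facts:  # placeholder values for illustration
          - distribution
          - distribution_major_version
      when: not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts
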
1727096117.54577: running the handler 11701 1727096117.54580: _low_level_execute_command(): starting 11701 1727096117.54582: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096117.55917: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096117.55985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096117.56153: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096117.56282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096117.56366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096117.58061: stdout chunk (state=3): >>>/root <<< 11701 1727096117.58488: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096117.58512: stderr chunk (state=3): >>><<< 11701 1727096117.58521: stdout chunk (state=3): >>><<< 11701 1727096117.58589: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096117.58870: _low_level_execute_command(): starting 11701 1727096117.58874: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096117.5877464-11812-18361970020600 `" && echo ansible-tmp-1727096117.5877464-11812-18361970020600="` echo /root/.ansible/tmp/ansible-tmp-1727096117.5877464-11812-18361970020600 `" ) 
&& sleep 0' 11701 1727096117.59942: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096117.60085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096117.60178: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096117.60193: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096117.60365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096117.62371: stdout chunk (state=3): >>>ansible-tmp-1727096117.5877464-11812-18361970020600=/root/.ansible/tmp/ansible-tmp-1727096117.5877464-11812-18361970020600 <<< 11701 1727096117.62456: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096117.62674: stderr chunk (state=3): >>><<< 11701 1727096117.62677: stdout chunk (state=3): >>><<< 11701 1727096117.62680: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096117.5877464-11812-18361970020600=/root/.ansible/tmp/ansible-tmp-1727096117.5877464-11812-18361970020600 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096117.62923: variable 'ansible_module_compression' from source: unknown 11701 1727096117.62926: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11701ub3f79xu/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11701 1727096117.62987: variable 'ansible_facts' from source: unknown 11701 1727096117.63340: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727096117.5877464-11812-18361970020600/AnsiballZ_setup.py 11701 1727096117.63796: Sending initial data 11701 1727096117.63806: Sent initial data (153 bytes) 11701 1727096117.65710: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096117.65955: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096117.66074: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096117.67633: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096117.67792: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
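
By this point the worker has announced every connection setting it will use for managed_node3: the ssh connection plugin, a 10-second timeout, /bin/sh as the shell, pipelining off, ZIP_DEFLATED module compression, and a remote temporary directory under /root/.ansible/tmp, reached through an already-open OpenSSH control master at /root/.ansible/cp/e9699315b0. Expressed as ordinary inventory variables, the same effective settings would look roughly like the sketch below; the ControlMaster/ControlPersist line is an assumption about how such a multiplexed master is usually requested, not something the log states.

    # host_vars/managed_node3.yml (illustrative; values mirror the
    # 'Set connection var' lines and the mkdir/auto-mux messages above)
    ansible_connection: ssh
    ansible_timeout: 10
    ansible_shell_type: sh
    ansible_shell_executable: /bin/sh
    ansible_pipelining: false
    ansible_module_compression: ZIP_DEFLATED
    ansible_remote_tmp: "~/.ansible/tmp"
    ansible_ssh_common_args: "-o ControlMaster=auto -o ControlPersist=60s"   # assumed
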
<<< 11701 1727096117.67831: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096117.5877464-11812-18361970020600/AnsiballZ_setup.py" <<< 11701 1727096117.67859: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpjakxetll /root/.ansible/tmp/ansible-tmp-1727096117.5877464-11812-18361970020600/AnsiballZ_setup.py <<< 11701 1727096117.68180: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpjakxetll" to remote "/root/.ansible/tmp/ansible-tmp-1727096117.5877464-11812-18361970020600/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096117.5877464-11812-18361970020600/AnsiballZ_setup.py" <<< 11701 1727096117.70440: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096117.70484: stdout chunk (state=3): >>><<< 11701 1727096117.70497: stderr chunk (state=3): >>><<< 11701 1727096117.70521: done transferring module to remote 11701 1727096117.70594: _low_level_execute_command(): starting 11701 1727096117.70603: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096117.5877464-11812-18361970020600/ /root/.ansible/tmp/ansible-tmp-1727096117.5877464-11812-18361970020600/AnsiballZ_setup.py && sleep 0' 11701 1727096117.72148: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096117.72173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096117.72396: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096117.72474: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096117.72581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096117.74406: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096117.74552: stderr chunk (state=3): >>><<< 11701 1727096117.74556: stdout chunk (state=3): >>><<< 11701 1727096117.74575: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096117.74583: _low_level_execute_command(): starting 11701 1727096117.74591: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096117.5877464-11812-18361970020600/AnsiballZ_setup.py && sleep 0' 11701 1727096117.76007: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096117.76010: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096117.76012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096117.76015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096117.76017: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096117.76045: stderr chunk (state=3): >>>debug2: match not found <<< 11701 1727096117.76059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096117.76137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096117.76284: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096117.76516: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096117.78796: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 11701 1727096117.78815: stdout chunk (state=3): >>>import _imp # builtin <<< 11701 1727096117.78840: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 11701 1727096117.78911: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 11701 1727096117.78952: stdout chunk (state=3): >>>import 'posix' # <<< 11701 1727096117.78987: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 11701 1727096117.79019: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 11701 1727096117.79071: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 11701 
1727096117.79093: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 11701 1727096117.79120: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # <<< 11701 1727096117.79158: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 11701 1727096117.79190: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 11701 1727096117.79201: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975e184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975de7b30> <<< 11701 1727096117.79237: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 11701 1727096117.79249: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975e1aa50> <<< 11701 1727096117.79274: stdout chunk (state=3): >>>import '_signal' # <<< 11701 1727096117.79293: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 11701 1727096117.79314: stdout chunk (state=3): >>>import 'io' # <<< 11701 1727096117.79350: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 11701 1727096117.79446: stdout chunk (state=3): >>>import '_collections_abc' # <<< 11701 1727096117.79470: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 11701 1727096117.79501: stdout chunk (state=3): >>>import 'os' # <<< 11701 1727096117.79537: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 11701 1727096117.79580: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 11701 1727096117.79605: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 11701 1727096117.79608: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 11701 1727096117.79626: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975be9130> <<< 11701 1727096117.79691: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 11701 1727096117.79708: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975be9fa0> <<< 11701 1727096117.79735: stdout chunk (state=3): >>>import 'site' # <<< 11701 1727096117.79771: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
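
The sequence just traced (create a remote temporary directory, put AnsiballZ_setup.py across with sftp, chmod it, then run it with /usr/bin/python3.12) is the standard non-pipelined execution path; the import lines that follow are that remote interpreter, started with PYTHONVERBOSE=1, loading the stdlib and unpacking the zipped module payload. With pipelining enabled the temp-dir, transfer, and chmod round-trips are skipped and the payload is fed to the remote Python over the existing SSH session. A sketch, with the variable placement purely illustrative:

    # group_vars/all.yml (illustrative)
    ansible_pipelining: true   # avoids the mkdir/sftp/chmod steps seen above
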
<<< 11701 1727096117.80177: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 11701 1727096117.80181: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 11701 1727096117.80208: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 11701 1727096117.80233: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 11701 1727096117.80288: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 11701 1727096117.80291: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 11701 1727096117.80353: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 11701 1727096117.80357: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975c27e90> <<< 11701 1727096117.80455: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 11701 1727096117.80536: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975c27f50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 11701 1727096117.80580: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975c5f890> <<< 11701 1727096117.80671: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975c5ff20> import '_collections' # <<< 11701 1727096117.80810: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975c3fb60> import '_functools' # <<< 11701 1727096117.80873: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975c3d280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975c25040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 11701 1727096117.80888: stdout chunk (state=3): >>>import '_sre' # <<< 11701 1727096117.80914: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches 
/usr/lib64/python3.12/re/_parser.py <<< 11701 1727096117.80954: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 11701 1727096117.80977: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 11701 1727096117.81038: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975c7f800> <<< 11701 1727096117.81052: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975c7e420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975c3e150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975c7cc80> <<< 11701 1727096117.81173: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 11701 1727096117.81229: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975cb4890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975c242c0> <<< 11701 1727096117.81233: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2975cb4d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975cb4bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 11701 1727096117.81245: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2975cb4fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975c22de0> <<< 11701 1727096117.81269: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 11701 1727096117.81373: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 11701 1727096117.81406: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975cb56d0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975cb53a0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches 
/usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975cb65d0> <<< 11701 1727096117.81450: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 11701 1727096117.81461: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 11701 1727096117.81535: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 11701 1727096117.81623: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975ccc7a0> <<< 11701 1727096117.81661: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2975ccdeb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 11701 1727096117.81664: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975cced50> <<< 11701 1727096117.81696: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2975ccf380> <<< 11701 1727096117.81716: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975cce2a0> <<< 11701 1727096117.81748: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 11701 1727096117.81791: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 11701 1727096117.81809: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2975ccfe00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975ccf530> <<< 11701 1727096117.81870: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975cb6570> <<< 11701 1727096117.81920: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 11701 1727096117.81947: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 11701 1727096117.81998: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 11701 1727096117.82011: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29759c3ce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 11701 1727096117.82093: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29759ec830> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29759ec590> <<< 11701 1727096117.82109: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29759ec770> <<< 11701 1727096117.82122: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 11701 1727096117.82189: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 11701 1727096117.82316: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29759ed100> <<< 11701 1727096117.82479: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29759edaf0> <<< 11701 1727096117.82511: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29759ec9b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29759c1e80> <<< 11701 1727096117.82526: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 11701 1727096117.82556: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 11701 1727096117.82597: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 11701 1727096117.82602: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 11701 1727096117.82632: 
stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29759eeea0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29759edc10> <<< 11701 1727096117.82675: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975cb6cc0> <<< 11701 1727096117.82678: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 11701 1727096117.82756: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 11701 1727096117.82774: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 11701 1727096117.82801: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 11701 1727096117.82832: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975a17230> <<< 11701 1727096117.82898: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 11701 1727096117.82912: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 11701 1727096117.82930: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 11701 1727096117.82952: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 11701 1727096117.82989: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975a3b590> <<< 11701 1727096117.83015: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 11701 1727096117.83067: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 11701 1727096117.83116: stdout chunk (state=3): >>>import 'ntpath' # <<< 11701 1727096117.83147: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975a9c2f0> <<< 11701 1727096117.83164: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 11701 1727096117.83202: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 11701 1727096117.83225: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 11701 1727096117.83272: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 11701 1727096117.83360: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975a9ea50> <<< 11701 1727096117.83445: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975a9c410> <<< 11701 1727096117.83481: stdout chunk 
(state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975a613a0> <<< 11701 1727096117.83509: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 11701 1727096117.83536: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29753253d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975a3a3c0> <<< 11701 1727096117.83553: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29759efe00> <<< 11701 1727096117.83733: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 11701 1727096117.83750: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f2975325670> <<< 11701 1727096117.84020: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_mcm8pnww/ansible_setup_payload.zip' # zipimport: zlib available <<< 11701 1727096117.84285: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 11701 1727096117.84321: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 11701 1727096117.84348: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297538f0e0> <<< 11701 1727096117.84366: stdout chunk (state=3): >>>import '_typing' # <<< 11701 1727096117.84547: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297536dfd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297536d190> <<< 11701 1727096117.84566: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.84673: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 11701 1727096117.84677: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.84689: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 11701 1727096117.86124: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.87379: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 11701 1727096117.87636: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297538d3d0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches 
/usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29753beb70> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29753be900> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29753be210> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 11701 1727096117.87660: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29753be660> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297538fd70> <<< 11701 1727096117.87703: stdout chunk (state=3): >>>import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29753bf890> <<< 11701 1727096117.87727: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29753bfad0> <<< 11701 1727096117.87730: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 11701 1727096117.87790: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 11701 1727096117.87802: stdout chunk (state=3): >>>import '_locale' # <<< 11701 1727096117.87853: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29753bffe0> import 'pwd' # <<< 11701 1727096117.87878: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 11701 1727096117.87905: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 11701 1727096117.87957: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975229df0> <<< 11701 1727096117.87985: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f297522ba10> <<< 11701 1727096117.88022: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 11701 1727096117.88026: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 11701 1727096117.88156: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297522c410> <<< 11701 1727096117.88183: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 11701 1727096117.88198: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297522d5b0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 11701 1727096117.88221: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 11701 1727096117.88284: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297522ffb0> <<< 11701 1727096117.88321: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29752343e0> <<< 11701 1727096117.88346: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297522e330> <<< 11701 1727096117.88372: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 11701 1727096117.88619: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 11701 1727096117.88623: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975237f50> <<< 11701 1727096117.88626: stdout chunk (state=3): >>>import '_tokenize' # <<< 11701 1727096117.88699: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975236a50> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29752367b0> <<< 11701 1727096117.88728: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 11701 1727096117.88740: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 11701 1727096117.88798: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975236cf0> <<< 11701 1727096117.88832: stdout chunk (state=3): >>>import 
'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297522e840> <<< 11701 1727096117.88865: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f297527c140> <<< 11701 1727096117.88896: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297527c290> <<< 11701 1727096117.88984: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 11701 1727096117.89014: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f297527dd90> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297527db50> <<< 11701 1727096117.89034: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 11701 1727096117.89068: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 11701 1727096117.89153: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29752802f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297527e480> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 11701 1727096117.89194: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 11701 1727096117.89258: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 11701 1727096117.89285: stdout chunk (state=3): >>>import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975283aa0> <<< 11701 1727096117.89508: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975280470> <<< 11701 1727096117.89512: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29752848f0> <<< 11701 1727096117.89524: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2975284ad0> <<< 11701 1727096117.89574: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2975284e60> <<< 11701 1727096117.89684: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297527c4d0> <<< 11701 1727096117.89703: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 11701 1727096117.89724: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f297510c3e0> <<< 11701 1727096117.89869: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 11701 1727096117.89898: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f297510d370> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975286ba0> <<< 11701 1727096117.90051: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2975287f50> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29752867e0> # zipimport: zlib available <<< 11701 1727096117.90055: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 11701 1727096117.90060: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.90078: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 
1727096117.90174: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11701 1727096117.90257: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available <<< 11701 1727096117.90273: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 11701 1727096117.90363: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.90483: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.91070: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.91674: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 11701 1727096117.91807: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 11701 1727096117.91811: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2975115610> <<< 11701 1727096117.91852: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 11701 1727096117.91864: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975116360> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297510d5e0> <<< 11701 1727096117.91912: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 11701 1727096117.92003: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11701 1727096117.92024: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 11701 1727096117.92127: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.92396: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 11701 1727096117.92410: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29751161b0> # zipimport: zlib available <<< 11701 1727096117.92797: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.93379: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.93413: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 11701 1727096117.93458: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.93500: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 11701 1727096117.93512: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.93700: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.errors' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 11701 1727096117.93740: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.93783: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 11701 1727096117.93803: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.94031: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.94271: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 11701 1727096117.94352: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 11701 1727096117.94365: stdout chunk (state=3): >>>import '_ast' # <<< 11701 1727096117.94423: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975117530> <<< 11701 1727096117.94439: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.94507: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.94671: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 11701 1727096117.94721: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # <<< 11701 1727096117.94726: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.94770: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.94812: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.94872: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.94940: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 11701 1727096117.95008: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 11701 1727096117.95139: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2975122300> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297511cd70> <<< 11701 1727096117.95241: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 11701 1727096117.95244: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.95294: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.95336: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.95387: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 11701 1727096117.95417: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 11701 1727096117.95559: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 11701 1727096117.95598: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297520ac30> <<< 11701 1727096117.95641: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29753ea900> <<< 11701 1727096117.95723: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29751224e0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975117290> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 11701 1727096117.95802: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11701 1727096117.95806: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 11701 1727096117.95808: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 11701 1727096117.95894: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 11701 1727096117.95990: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11701 1727096117.96036: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.96054: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.96113: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.96126: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.96164: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.96282: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 11701 1727096117.96329: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.96401: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.96548: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.96554: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 11701 1727096117.96662: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.96828: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.96991: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.96997: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 11701 1727096117.97013: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 11701 1727096117.97038: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29751b23f0> <<< 11701 1727096117.97133: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 11701 1727096117.97189: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 11701 1727096117.97201: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2974df02f0> <<< 11701 1727096117.97246: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 11701 1727096117.97251: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2974df0620> <<< 11701 1727096117.97374: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975198e90> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29751b2f30> <<< 11701 1727096117.97377: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29751b0a70> <<< 11701 1727096117.97379: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29751b05f0> <<< 11701 1727096117.97451: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 11701 1727096117.97473: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 11701 1727096117.97479: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 11701 1727096117.97559: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2974df35f0> import 'heapq' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2974df2ea0> <<< 11701 1727096117.97578: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 11701 1727096117.97596: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2974df3050> <<< 11701 1727096117.98099: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2974df22d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2974df37a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2974e4e2a0> <<< 11701 1727096117.98103: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2974e4c2c0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29751b1b80> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available <<< 11701 1727096117.98136: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 11701 1727096117.98140: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.98182: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.98231: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 11701 1727096117.98253: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.98274: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 11701 1727096117.98287: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.98324: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.98438: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available <<< 11701 1727096117.98466: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 11701 1727096117.98471: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.98508: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.98553: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 11701 1727096117.98567: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.98656: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 11701 1727096117.98761: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11701 1727096117.98796: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 11701 1727096117.98811: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 11701 1727096117.99390: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.99799: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 11701 1727096117.99803: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096117.99993: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 11701 1727096117.99997: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 11701 1727096118.00000: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.00074: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 11701 1727096118.00184: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # <<< 11701 1727096118.00198: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.00232: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.00289: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 11701 1727096118.00303: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.00394: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 11701 1727096118.00444: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.00523: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 11701 1727096118.00613: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2974e4f5c0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 11701 1727096118.00792: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2974e4ee40> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 11701 1727096118.00806: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.00875: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 11701 1727096118.00942: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.00981: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.01101: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 11701 1727096118.01111: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.01134: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.01219: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 11701 1727096118.01222: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.01265: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.01365: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 11701 1727096118.01377: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 11701 1727096118.01437: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 11701 1727096118.01559: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2974e8e3f0> <<< 11701 1727096118.01739: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2974e7e1b0> <<< 11701 1727096118.01772: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 11701 1727096118.01775: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.01845: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 11701 1727096118.01944: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.01996: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.02108: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.02300: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 11701 1727096118.02303: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # <<< 11701 1727096118.02313: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11701 1727096118.02508: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 11701 1727096118.02514: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 11701 1727096118.02542: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2974ea1e80> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2974e7f3b0> <<< 11701 1727096118.02547: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 11701 1727096118.02566: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 11701 1727096118.02582: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.02623: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.02664: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 11701 1727096118.02678: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.02839: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.02990: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 11701 1727096118.02993: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 11701 1727096118.03089: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.03189: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.03232: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.03394: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 11701 1727096118.03397: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 11701 1727096118.03473: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.03625: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 11701 1727096118.03783: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.03866: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 11701 1727096118.03936: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11701 1727096118.03955: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.04522: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.05074: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 11701 1727096118.05078: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.05258: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.05289: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 11701 1727096118.05301: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.05393: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.05506: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 11701 1727096118.05666: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.05888: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 11701 1727096118.05952: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.05966: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 11701 1727096118.06072: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.06393: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.06396: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.06663: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available <<< 11701 1727096118.06695: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 11701 1727096118.06711: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.06753: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 11701 1727096118.06757: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.06844: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 11701 1727096118.06892: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 11701 1727096118.06964: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 11701 1727096118.07020: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.07092: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 11701 1727096118.07095: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.07288: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 11701 1727096118.07500: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.07830: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 11701 1727096118.07844: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.07888: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 11701 1727096118.08089: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 11701 1727096118.08129: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 11701 1727096118.08262: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11701 1727096118.08294: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 11701 1727096118.08316: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.08332: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 11701 1727096118.08395: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11701 1727096118.08434: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 11701 1727096118.08502: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 11701 1727096118.08543: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.08607: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.08736: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.08742: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 11701 1727096118.08761: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 11701 1727096118.08854: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.08865: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 11701 1727096118.09143: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.09397: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # <<< 11701 1727096118.09400: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 
1727096118.09442: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.09494: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 11701 1727096118.09512: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.09600: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.09668: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 11701 1727096118.09682: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.09774: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.09925: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 11701 1727096118.09955: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.10174: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 11701 1727096118.10279: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2974c9ee40> <<< 11701 1727096118.10282: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2974c9ec00> <<< 11701 1727096118.10363: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2974c9d760> <<< 11701 1727096118.11457: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 44844 10.31.14.152 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 44844 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, 
"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7DqKfC0Ye8uNTSswfdzhsJfWNkZqr/RUstyC1z93+5saW22gxwNSvixV/q/ri8GxqpAsSE+oK2wsR0V0GdogZH271MQEZ63vu1adXuVvwMLN81oGLTzCz50BgmDXlvpEVTodRA7qlbviA7mmwUA/1WP1v1UqCiHmBjaiQT14v9PJhSSiWk1S/2gPlOmfl801arqI6YnvGYHiIop+XESUB4W8ewtJtTJhFqzucGnDCUYA1VAqcSIdjoQaZhZeDXc1gRTzx7QIe3EGJnCnomIPoXNvQk2UgnH08TJGZFoICgqdfZUi+g2uK+PweRJ1TUsmlf86iGOSFYgnCserDeAdpOIuLR5oBE8UNKjxPBW6hJ3It+vPT8mrYLOXrlbt7it9HhxfgYWcqc6ebJyBYNMo0bPdddEpIPiWDXZOoQmZGaGOpSa1K+8hxIvZ9t+jl8b8b8sODYeoVjVaeVXt0i5hoW0CoLWO77c+cGgwJ5+kzXD7eo/SVn+kYjDfKFBCtkJU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ80c6wNB873zUjbHI8VtU557DuKd4APS4WjMTqAOLKKumkxtoQY9gCkk5ZG2HLqdKHBgsFER7nThIQ+11R1mBw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINeHLdc2PHGxhVM3VxIFOOPlFPTEnJXEcPAkPavmyu6v", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-152", "ansible_nodename": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24182c1ede649ec25b32fe78ed72bb", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "55", "second": "18", "epoch": "1727096118", "epoch_int": "1727096118", "date": "2024-09-23", "time": "08:55:18", "iso8601_micro": "2024-09-23T12:55:18.110459Z", "iso8601": "2024-09-23T12:55:18Z", "iso8601_basic": "20240923T085518110459", "iso8601_basic_short": "20240923T085518", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", 
"ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11701 1727096118.12088: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ <<< 11701 1727096118.12323: stdout chunk (state=3): >>># clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing 
ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing 
ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection <<< 11701 1727096118.12336: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # 
cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy 
ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg <<< 11701 1727096118.12339: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos <<< 11701 1727096118.12488: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 11701 1727096118.12691: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 11701 1727096118.12719: stdout chunk (state=3): >>># 
destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 11701 1727096118.12748: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 <<< 11701 1727096118.12758: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 11701 1727096118.12775: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 11701 1727096118.12804: stdout chunk (state=3): >>># destroy ntpath <<< 11701 1727096118.12833: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 11701 1727096118.12877: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 11701 1727096118.12975: stdout chunk (state=3): >>># destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil <<< 11701 1727096118.12983: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 11701 1727096118.12992: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 11701 1727096118.13080: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 11701 1727096118.13084: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing <<< 11701 1727096118.13155: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json <<< 11701 1727096118.13186: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 11701 1727096118.13191: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 11701 1727096118.13235: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna <<< 11701 1727096118.13276: stdout chunk (state=3): >>># destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback <<< 11701 1727096118.13319: stdout chunk (state=3): >>># destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] 
wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math <<< 11701 1727096118.13408: stdout chunk (state=3): >>># cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath <<< 11701 1727096118.13474: stdout chunk (state=3): >>># cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 11701 1727096118.13491: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 11701 1727096118.13633: stdout chunk (state=3): >>># destroy sys.monitoring <<< 11701 1727096118.13638: stdout chunk (state=3): >>># destroy _socket <<< 11701 1727096118.13653: stdout chunk (state=3): >>># destroy _collections <<< 11701 1727096118.13735: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 11701 1727096118.13740: stdout chunk (state=3): >>># destroy tokenize # destroy ansible.module_utils.six.moves.urllib <<< 11701 1727096118.13844: stdout chunk (state=3): >>># destroy copyreg # destroy contextlib <<< 11701 1727096118.13847: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 11701 1727096118.13885: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8<<< 11701 1727096118.13955: stdout chunk (state=3): >>> # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading 
# destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 11701 1727096118.13958: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 11701 1727096118.14062: stdout chunk (state=3): >>># destroy _hashlib <<< 11701 1727096118.14066: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 11701 1727096118.14451: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 11701 1727096118.14455: stderr chunk (state=3): >>><<< 11701 1727096118.14462: stdout chunk (state=3): >>><<< 11701 1727096118.14689: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975e184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975de7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975e1aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975be9130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975be9fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975c27e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975c27f50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975c5f890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975c5ff20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975c3fb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975c3d280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975c25040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975c7f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975c7e420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975c3e150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975c7cc80> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975cb4890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975c242c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2975cb4d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975cb4bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2975cb4fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975c22de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975cb56d0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975cb53a0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975cb65d0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975ccc7a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2975ccdeb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f2975cced50> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2975ccf380> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975cce2a0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2975ccfe00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975ccf530> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975cb6570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29759c3ce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29759ec830> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29759ec590> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29759ec770> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29759ed100> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29759edaf0> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f29759ec9b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29759c1e80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29759eeea0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29759edc10> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975cb6cc0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975a17230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975a3b590> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975a9c2f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975a9ea50> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975a9c410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975a613a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29753253d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975a3a3c0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29759efe00> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f2975325670> # zipimport: found 103 names in '/tmp/ansible_setup_payload_mcm8pnww/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297538f0e0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297536dfd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297536d190> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297538d3d0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29753beb70> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29753be900> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29753be210> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29753be660> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297538fd70> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29753bf890> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29753bfad0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29753bffe0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975229df0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f297522ba10> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297522c410> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297522d5b0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297522ffb0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29752343e0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297522e330> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975237f50> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975236a50> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29752367b0> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975236cf0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297522e840> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f297527c140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297527c290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f297527dd90> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297527db50> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29752802f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297527e480> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975283aa0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975280470> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29752848f0> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2975284ad0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2975284e60> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297527c4d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f297510c3e0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f297510d370> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975286ba0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2975287f50> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29752867e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f2975115610> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975116360> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297510d5e0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29751161b0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975117530> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2975122300> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297511cd70> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f297520ac30> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29753ea900> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29751224e0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975117290> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29751b23f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2974df02f0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f2974df0620> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2975198e90> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29751b2f30> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29751b0a70> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29751b05f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2974df35f0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2974df2ea0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2974df3050> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2974df22d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2974df37a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2974e4e2a0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2974e4c2c0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29751b1b80> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2974e4f5c0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2974e4ee40> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2974e8e3f0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2974e7e1b0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2974ea1e80> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2974e7f3b0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2974c9ee40> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2974c9ec00> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2974c9d760> {"ansible_facts": {"ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 44844 10.31.14.152 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 44844 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, 
"ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7DqKfC0Ye8uNTSswfdzhsJfWNkZqr/RUstyC1z93+5saW22gxwNSvixV/q/ri8GxqpAsSE+oK2wsR0V0GdogZH271MQEZ63vu1adXuVvwMLN81oGLTzCz50BgmDXlvpEVTodRA7qlbviA7mmwUA/1WP1v1UqCiHmBjaiQT14v9PJhSSiWk1S/2gPlOmfl801arqI6YnvGYHiIop+XESUB4W8ewtJtTJhFqzucGnDCUYA1VAqcSIdjoQaZhZeDXc1gRTzx7QIe3EGJnCnomIPoXNvQk2UgnH08TJGZFoICgqdfZUi+g2uK+PweRJ1TUsmlf86iGOSFYgnCserDeAdpOIuLR5oBE8UNKjxPBW6hJ3It+vPT8mrYLOXrlbt7it9HhxfgYWcqc6ebJyBYNMo0bPdddEpIPiWDXZOoQmZGaGOpSa1K+8hxIvZ9t+jl8b8b8sODYeoVjVaeVXt0i5hoW0CoLWO77c+cGgwJ5+kzXD7eo/SVn+kYjDfKFBCtkJU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ80c6wNB873zUjbHI8VtU557DuKd4APS4WjMTqAOLKKumkxtoQY9gCkk5ZG2HLqdKHBgsFER7nThIQ+11R1mBw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINeHLdc2PHGxhVM3VxIFOOPlFPTEnJXEcPAkPavmyu6v", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-152", "ansible_nodename": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24182c1ede649ec25b32fe78ed72bb", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "55", "second": "18", "epoch": "1727096118", "epoch_int": "1727096118", "date": "2024-09-23", "time": "08:55:18", 
"iso8601_micro": "2024-09-23T12:55:18.110459Z", "iso8601": "2024-09-23T12:55:18Z", "iso8601_basic": "20240923T085518110459", "iso8601_basic_short": "20240923T085518", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # 
cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy 
ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing 
ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy 
ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # 
destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] 
wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
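For orientation: the JSON payload above is the return value of the ansible.builtin.setup module, and its "invocation" block records the arguments it actually ran with (gather_subset: ["min"], gather_timeout: 10, filter: [], fact_path: /etc/ansible/facts.d). A task that would produce an equivalent invocation is sketched below; the task name is taken from the task result reported later in this log, and the remaining keys simply mirror the module_args shown above, so treat this as a reconstruction rather than the playbook's actual text.

    # Sketch only -- reconstructed from the "invocation" block above,
    # not copied from the test playbook itself.
    - name: Gather the minimum subset of ansible_facts required by the network role test
      ansible.builtin.setup:
        gather_subset:
          - min              # only the minimal facts subset, as logged
        gather_timeout: 10   # seconds, as logged
        filter: []           # no fact filtering, as logged
        fact_path: /etc/ansible/facts.d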
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache [... verbatim repeat of the Python interpreter cleanup/destroy output already shown above ...] # clear sys.audit hooks 11701 1727096118.16513: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096117.5877464-11812-18361970020600/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096118.16517: _low_level_execute_command(): starting 11701 1727096118.16520: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096117.5877464-11812-18361970020600/ > /dev/null 2>&1 && sleep 0' 11701 1727096118.16525: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all'
host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096118.16747: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096118.16753: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096118.16821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096118.18760: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096118.18878: stdout chunk (state=3): >>><<< 11701 1727096118.18882: stderr chunk (state=3): >>><<< 11701 1727096118.18885: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096118.18887: handler run complete 11701 1727096118.18921: variable 'ansible_facts' from source: unknown 11701 1727096118.18974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096118.19200: variable 'ansible_facts' from source: unknown 11701 1727096118.19309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096118.19312: attempt loop complete, returning result 11701 1727096118.19315: _execute() done 11701 1727096118.19317: dumping result to json 11701 1727096118.19326: done dumping result, returning 11701 1727096118.19333: done running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [0afff68d-5257-a05c-c957-0000000000dd] 11701 1727096118.19337: sending task result for task 0afff68d-5257-a05c-c957-0000000000dd ok: [managed_node3] 11701 1727096118.19957: no more pending results, returning what we have 11701 1727096118.19961: results queue empty 11701 1727096118.19962: checking for any_errors_fatal 11701 1727096118.19963: done checking for any_errors_fatal 11701 1727096118.19963: checking for max_fail_percentage 11701 1727096118.19965: done checking for max_fail_percentage 11701 1727096118.19965: checking to see if all hosts have failed and the running result is not ok 11701 1727096118.19966: done checking to see if all hosts have failed 11701 1727096118.19968: getting the remaining hosts for this loop 11701 1727096118.19970: done 
getting the remaining hosts for this loop 11701 1727096118.19973: getting the next task for host managed_node3 11701 1727096118.19983: done getting next task for host managed_node3 11701 1727096118.19986: ^ task is: TASK: Check if system is ostree 11701 1727096118.19989: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11701 1727096118.19997: getting variables 11701 1727096118.19999: in VariableManager get_vars() 11701 1727096118.20028: Calling all_inventory to load vars for managed_node3 11701 1727096118.20194: Calling groups_inventory to load vars for managed_node3 11701 1727096118.20200: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096118.20212: Calling all_plugins_play to load vars for managed_node3 11701 1727096118.20215: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096118.20218: Calling groups_plugins_play to load vars for managed_node3 11701 1727096118.20638: done sending task result for task 0afff68d-5257-a05c-c957-0000000000dd 11701 1727096118.20642: WORKER PROCESS EXITING 11701 1727096118.20669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096118.21098: done with get_vars() 11701 1727096118.21110: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Monday 23 September 2024 08:55:18 -0400 (0:00:00.770) 0:00:02.178 ****** 11701 1727096118.21400: entering _queue_task() for managed_node3/stat 11701 1727096118.22040: worker is 1 (out of 1 available) 11701 1727096118.22277: exiting _queue_task() for managed_node3/stat 11701 1727096118.22288: done queuing things up, now waiting for results queue to drain 11701 1727096118.22290: waiting for pending results... 
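The next task queued here is "Check if system is ostree" (el_repo_setup.yml:17), which the worker runs through the stat action for managed_node3 and guards with the conditional not __network_is_ostree is defined, evaluated in the entries that follow. A plausible shape for that task is sketched below; the stat path and the register variable are not visible in this excerpt, so both are assumptions included for illustration only.

    # Sketch only -- the module (stat) and the "when" conditional come from the
    # log entries; the path and register name are assumed, not taken from the playbook.
    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted        # assumed path, not shown in this log excerpt
      register: __network_ostree_stat   # assumed variable name
      when: not __network_is_ostree is defined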
11701 1727096118.22570: running TaskExecutor() for managed_node3/TASK: Check if system is ostree 11701 1727096118.22855: in run() - task 0afff68d-5257-a05c-c957-0000000000df 11701 1727096118.22880: variable 'ansible_search_path' from source: unknown 11701 1727096118.22891: variable 'ansible_search_path' from source: unknown 11701 1727096118.23105: calling self._execute() 11701 1727096118.23212: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096118.23228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096118.23244: variable 'omit' from source: magic vars 11701 1727096118.24424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11701 1727096118.25376: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11701 1727096118.25634: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11701 1727096118.25833: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11701 1727096118.25942: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11701 1727096118.26344: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11701 1727096118.26381: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11701 1727096118.26528: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096118.26719: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11701 1727096118.27066: Evaluated conditional (not __network_is_ostree is defined): True 11701 1727096118.27109: variable 'omit' from source: magic vars 11701 1727096118.27209: variable 'omit' from source: magic vars 11701 1727096118.27357: variable 'omit' from source: magic vars 11701 1727096118.27459: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096118.27569: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096118.27756: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096118.27759: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096118.27761: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096118.27897: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096118.27905: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096118.28082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096118.28356: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096118.28372: Set connection var ansible_timeout to 10 11701 1727096118.28380: Set 
connection var ansible_shell_type to sh 11701 1727096118.28390: Set connection var ansible_shell_executable to /bin/sh 11701 1727096118.28573: Set connection var ansible_connection to ssh 11701 1727096118.28576: Set connection var ansible_pipelining to False 11701 1727096118.28579: variable 'ansible_shell_executable' from source: unknown 11701 1727096118.28581: variable 'ansible_connection' from source: unknown 11701 1727096118.28584: variable 'ansible_module_compression' from source: unknown 11701 1727096118.28588: variable 'ansible_shell_type' from source: unknown 11701 1727096118.28595: variable 'ansible_shell_executable' from source: unknown 11701 1727096118.28735: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096118.28738: variable 'ansible_pipelining' from source: unknown 11701 1727096118.28741: variable 'ansible_timeout' from source: unknown 11701 1727096118.28743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096118.29223: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11701 1727096118.29233: variable 'omit' from source: magic vars 11701 1727096118.29374: starting attempt loop 11701 1727096118.29377: running the handler 11701 1727096118.29379: _low_level_execute_command(): starting 11701 1727096118.29381: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096118.30712: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096118.30860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096118.30991: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096118.31065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096118.32699: stdout chunk (state=3): >>>/root <<< 11701 1727096118.32898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096118.32929: stderr chunk (state=3): >>><<< 11701 1727096118.33029: stdout chunk (state=3): >>><<< 11701 1727096118.33055: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096118.33084: _low_level_execute_command(): starting 11701 1727096118.33095: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096118.3307128-11832-91339172282415 `" && echo ansible-tmp-1727096118.3307128-11832-91339172282415="` echo /root/.ansible/tmp/ansible-tmp-1727096118.3307128-11832-91339172282415 `" ) && sleep 0' 11701 1727096118.34377: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096118.34582: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096118.34701: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096118.34866: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096118.36863: stdout chunk (state=3): >>>ansible-tmp-1727096118.3307128-11832-91339172282415=/root/.ansible/tmp/ansible-tmp-1727096118.3307128-11832-91339172282415 <<< 11701 1727096118.36988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096118.37091: stderr chunk (state=3): >>><<< 11701 1727096118.37094: stdout chunk (state=3): >>><<< 11701 1727096118.37112: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096118.3307128-11832-91339172282415=/root/.ansible/tmp/ansible-tmp-1727096118.3307128-11832-91339172282415 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096118.37477: variable 'ansible_module_compression' from source: unknown 11701 1727096118.37481: ANSIBALLZ: Using lock for stat 11701 1727096118.37483: ANSIBALLZ: Acquiring lock 11701 1727096118.37486: ANSIBALLZ: Lock acquired: 139907404355616 11701 1727096118.37487: ANSIBALLZ: Creating module 11701 1727096118.57416: ANSIBALLZ: Writing module into payload 11701 1727096118.57580: ANSIBALLZ: Writing module 11701 1727096118.57715: ANSIBALLZ: Renaming module 11701 1727096118.57763: ANSIBALLZ: Done creating module 11701 1727096118.58104: variable 'ansible_facts' from source: unknown 11701 1727096118.58306: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096118.3307128-11832-91339172282415/AnsiballZ_stat.py 11701 1727096118.58783: Sending initial data 11701 1727096118.58787: Sent initial data (152 bytes) 11701 1727096118.60023: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096118.60039: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096118.60076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096118.60212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096118.60318: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096118.60579: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096118.60637: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096118.62331: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: 
Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096118.62359: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11701 1727096118.62445: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpv83bji66 /root/.ansible/tmp/ansible-tmp-1727096118.3307128-11832-91339172282415/AnsiballZ_stat.py <<< 11701 1727096118.62461: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096118.3307128-11832-91339172282415/AnsiballZ_stat.py" <<< 11701 1727096118.62491: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpv83bji66" to remote "/root/.ansible/tmp/ansible-tmp-1727096118.3307128-11832-91339172282415/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096118.3307128-11832-91339172282415/AnsiballZ_stat.py" <<< 11701 1727096118.64681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096118.64685: stdout chunk (state=3): >>><<< 11701 1727096118.64687: stderr chunk (state=3): >>><<< 11701 1727096118.64689: done transferring module to remote 11701 1727096118.64691: _low_level_execute_command(): starting 11701 1727096118.64695: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096118.3307128-11832-91339172282415/ /root/.ansible/tmp/ansible-tmp-1727096118.3307128-11832-91339172282415/AnsiballZ_stat.py && sleep 0' 11701 1727096118.65939: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096118.65964: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096118.65982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096118.66006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096118.66059: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096118.66184: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096118.66295: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096118.66499: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 11701 1727096118.68433: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096118.68547: stderr chunk (state=3): >>><<< 11701 1727096118.68556: stdout chunk (state=3): >>><<< 11701 1727096118.68812: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096118.68816: _low_level_execute_command(): starting 11701 1727096118.68822: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096118.3307128-11832-91339172282415/AnsiballZ_stat.py && sleep 0' 11701 1727096118.70220: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096118.70258: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096118.70287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096118.70407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096118.70424: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096118.70439: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096118.70542: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096118.72871: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 11701 1727096118.72880: stdout chunk (state=3): >>>import _imp # builtin <<< 11701 1727096118.72901: stdout chunk (state=3): >>>import '_thread' # import 
'_warnings' # import '_weakref' # <<< 11701 1727096118.72971: stdout chunk (state=3): >>>import '_io' # <<< 11701 1727096118.73009: stdout chunk (state=3): >>>import 'marshal' # <<< 11701 1727096118.73019: stdout chunk (state=3): >>>import 'posix' # <<< 11701 1727096118.73084: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 11701 1727096118.73088: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 11701 1727096118.73139: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 11701 1727096118.73142: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 11701 1727096118.73205: stdout chunk (state=3): >>>import '_codecs' # <<< 11701 1727096118.73216: stdout chunk (state=3): >>>import 'codecs' # <<< 11701 1727096118.73242: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e954184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e953e7b30> <<< 11701 1727096118.73291: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 11701 1727096118.73294: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9541aa50> <<< 11701 1727096118.73350: stdout chunk (state=3): >>>import '_signal' # <<< 11701 1727096118.73353: stdout chunk (state=3): >>>import '_abc' # import 'abc' # import 'io' # <<< 11701 1727096118.73403: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 11701 1727096118.73470: stdout chunk (state=3): >>>import '_collections_abc' # <<< 11701 1727096118.73499: stdout chunk (state=3): >>>import 'genericpath' # <<< 11701 1727096118.73544: stdout chunk (state=3): >>>import 'posixpath' # import 'os' # <<< 11701 1727096118.73636: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 11701 1727096118.73640: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e951c9130> <<< 11701 1727096118.73737: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 11701 1727096118.73741: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e951c9fa0> import 'site' # <<< 11701 
1727096118.73800: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 11701 1727096118.73996: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 11701 1727096118.74065: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 11701 1727096118.74159: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 11701 1727096118.74165: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 11701 1727096118.74195: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e95207e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 11701 1727096118.74232: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 11701 1727096118.74255: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e95207f50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 11701 1727096118.74510: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 11701 1727096118.74518: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9523f890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9523ff20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9521fb60> import '_functools' # <<< 11701 1727096118.74534: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9521d280> <<< 11701 1727096118.74624: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e95205040> <<< 11701 1727096118.74642: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 11701 1727096118.74671: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 11701 1727096118.74688: stdout chunk (state=3): 
>>>import '_sre' # <<< 11701 1727096118.74713: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 11701 1727096118.74750: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 11701 1727096118.74837: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9525f800> <<< 11701 1727096118.74850: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9525e420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9521e150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9525cc80> <<< 11701 1727096118.74908: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e95294890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e952042c0> <<< 11701 1727096118.74977: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 11701 1727096118.75005: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e95294d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e95294bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e95294fe0> <<< 11701 1727096118.75061: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e95202de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 11701 1727096118.75147: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 11701 1727096118.75375: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e952956d0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e952953a0> import 'importlib.machinery' # <<< 11701 1727096118.75425: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e952965d0> <<< 11701 1727096118.75429: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e952ac7a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e952adeb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e952aed50> <<< 11701 1727096118.75453: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e952af380> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e952ae2a0> <<< 11701 1727096118.75478: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 11701 1727096118.75491: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 11701 1727096118.75526: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e952afe00> <<< 11701 1727096118.75547: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e952af530> <<< 11701 1727096118.75585: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e95296570> <<< 11701 1727096118.75604: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 11701 1727096118.75633: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 11701 1727096118.75656: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 11701 1727096118.75673: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 11701 1727096118.75708: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e95037ce0> <<< 11701 1727096118.75737: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 11701 1727096118.75766: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e95060740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e950604a0> <<< 11701 1727096118.75789: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e95060770> <<< 11701 1727096118.75817: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 11701 1727096118.75889: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 11701 1727096118.76022: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e950610a0> <<< 11701 1727096118.76174: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 11701 1727096118.76282: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e95061a60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e95060950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e95035e80> <<< 11701 1727096118.76292: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 11701 1727096118.76354: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e95062e10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e950618e0> import 'tempfile' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f9e95296cc0> <<< 11701 1727096118.76359: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 11701 1727096118.76454: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 11701 1727096118.76548: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9508b170> <<< 11701 1727096118.76592: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 11701 1727096118.76679: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 11701 1727096118.76688: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e950af4d0> <<< 11701 1727096118.76773: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 11701 1727096118.76810: stdout chunk (state=3): >>>import 'ntpath' # <<< 11701 1727096118.76861: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e951102f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 11701 1727096118.76892: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 11701 1727096118.76920: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 11701 1727096118.77030: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e95112a20> <<< 11701 1727096118.77146: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e951103e0> <<< 11701 1727096118.77149: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e950d52e0> <<< 11701 1727096118.77176: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94f153d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e950ae300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e95063d40> <<< 11701 
1727096118.77291: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 11701 1727096118.77324: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f9e950ae660> <<< 11701 1727096118.77497: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_re506wic/ansible_stat_payload.zip' <<< 11701 1727096118.77533: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.77660: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 11701 1727096118.77683: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 11701 1727096118.77786: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 11701 1727096118.77880: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 11701 1727096118.77901: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94f6b0e0> import '_typing' # <<< 11701 1727096118.78023: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94f49fd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94f49160> <<< 11701 1727096118.78081: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.78143: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 11701 1727096118.79583: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.80715: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 11701 1727096118.80721: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94f69760> <<< 11701 1727096118.80725: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 11701 1727096118.80752: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 11701 1727096118.80755: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 11701 1727096118.80779: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 11701 1727096118.80783: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 11701 1727096118.80813: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed 
from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 11701 1727096118.80819: stdout chunk (state=3): >>>import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e94f92ab0> <<< 11701 1727096118.80856: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94f92870> <<< 11701 1727096118.80882: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94f92180> <<< 11701 1727096118.80904: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 11701 1727096118.80918: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 11701 1727096118.80952: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94f92660> <<< 11701 1727096118.80955: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9541a9c0> <<< 11701 1727096118.80969: stdout chunk (state=3): >>>import 'atexit' # <<< 11701 1727096118.80993: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 11701 1727096118.80996: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e94f93830> <<< 11701 1727096118.81021: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 11701 1727096118.81029: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e94f939e0> <<< 11701 1727096118.81041: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 11701 1727096118.81091: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 11701 1727096118.81103: stdout chunk (state=3): >>>import '_locale' # <<< 11701 1727096118.81150: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94f93f20> <<< 11701 1727096118.81160: stdout chunk (state=3): >>>import 'pwd' # <<< 11701 1727096118.81178: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 11701 1727096118.81201: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 11701 1727096118.81273: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94915bb0> <<< 11701 1727096118.81289: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e94917860> <<< 11701 1727096118.81302: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc 
matches /usr/lib64/python3.12/selectors.py <<< 11701 1727096118.81312: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 11701 1727096118.81349: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94918260> <<< 11701 1727096118.81363: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 11701 1727096118.81396: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 11701 1727096118.81409: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94919130> <<< 11701 1727096118.81436: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 11701 1727096118.81477: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 11701 1727096118.81499: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 11701 1727096118.81555: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9491be90> <<< 11701 1727096118.81632: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e94f4b0b0> <<< 11701 1727096118.81636: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9491a150> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 11701 1727096118.81666: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 11701 1727096118.81693: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 11701 1727096118.81740: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 11701 1727096118.81769: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 11701 1727096118.81781: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94923d10> import '_tokenize' # <<< 11701 1727096118.81880: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e949227e0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94922540> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 11701 1727096118.81896: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 11701 1727096118.81955: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94922a50> <<< 11701 1727096118.81993: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9491a660> <<< 11701 1727096118.82021: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e9496bec0> <<< 11701 1727096118.82038: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9496bfb0> <<< 11701 1727096118.82063: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 11701 1727096118.82089: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 11701 1727096118.82105: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 11701 1727096118.82164: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e9496dac0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9496d880> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 11701 1727096118.82300: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 11701 1727096118.82343: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e9496ffb0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9496e180> <<< 11701 1727096118.82375: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 11701 1727096118.82416: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 11701 1727096118.82457: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 11701 1727096118.82485: stdout chunk (state=3): >>>import '_string' # <<< 11701 1727096118.82498: stdout 
chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94973710> <<< 11701 1727096118.82632: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e949700e0> <<< 11701 1727096118.82679: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e949744d0> <<< 11701 1727096118.82709: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e94974500> <<< 11701 1727096118.82762: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e94974a10> <<< 11701 1727096118.82792: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9496c1a0> <<< 11701 1727096118.82815: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 11701 1727096118.82845: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 11701 1727096118.82849: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 11701 1727096118.82919: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 11701 1727096118.82924: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e948001d0> <<< 11701 1727096118.83104: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e94801400> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94976960> <<< 11701 1727096118.83136: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e94977d10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94976570> # zipimport: zlib available <<< 11701 1727096118.83139: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 11701 1727096118.83157: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.83241: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.83331: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.83360: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 11701 1727096118.83389: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11701 1727096118.83407: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 11701 1727096118.83519: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.83636: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.84198: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.84844: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 11701 1727096118.84848: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 11701 1727096118.84890: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e948055e0> <<< 11701 1727096118.84999: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94806480> <<< 11701 1727096118.85002: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94975820> <<< 11701 1727096118.85046: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 11701 1727096118.85079: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11701 1727096118.85108: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 11701 1727096118.85261: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.85426: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 11701 1727096118.85451: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94806330> # zipimport: zlib available <<< 11701 1727096118.85935: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.86404: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 
1727096118.86474: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.86548: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 11701 1727096118.86559: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.86589: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.86644: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 11701 1727096118.86649: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.86724: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.86845: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 11701 1727096118.86967: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 11701 1727096118.87289: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.87429: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 11701 1727096118.87583: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94807530> # zipimport: zlib available <<< 11701 1727096118.87666: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.87801: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 11701 1727096118.87814: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.87922: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available <<< 11701 1727096118.87952: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.88004: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.88156: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 11701 1727096118.88159: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 11701 1727096118.88281: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e948120c0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9480f380> <<< 11701 1727096118.88418: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 11701 1727096118.88463: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.88593: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.88610: stdout chunk (state=3): >>># zipimport: 
zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 11701 1727096118.88700: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 11701 1727096118.88805: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e949029c0> <<< 11701 1727096118.88937: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94fca690> <<< 11701 1727096118.88990: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e948121b0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e948016d0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available <<< 11701 1727096118.89199: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # <<< 11701 1727096118.89268: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 11701 1727096118.89332: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.89577: stdout chunk (state=3): >>># zipimport: zlib available <<< 11701 1727096118.89625: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 11701 1727096118.90070: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr <<< 11701 1727096118.90109: stdout chunk (state=3): >>># cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing 
__main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ <<< 11701 1727096118.90213: stdout chunk (state=3): >>># cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing 
tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing 
ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 11701 1727096118.90509: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 11701 1727096118.90594: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd <<< 11701 1727096118.90631: stdout chunk (state=3): >>># destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 11701 1727096118.90740: stdout chunk (state=3): >>># destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro <<< 11701 1727096118.90816: stdout chunk (state=3): >>># destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap <<< 11701 1727096118.90910: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # 
cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 11701 1727096118.90914: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 11701 1727096118.91103: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 11701 1727096118.91130: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid <<< 11701 1727096118.91170: stdout chunk (state=3): >>># destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 11701 1727096118.91217: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 11701 1727096118.91304: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 11701 1727096118.91333: stdout chunk (state=3): >>># destroy time # destroy _random # destroy _weakref <<< 11701 1727096118.91363: stdout chunk (state=3): >>># destroy _hashlib <<< 11701 1727096118.91398: stdout chunk (state=3): >>># destroy _operator # destroy _string # destroy re # destroy itertools <<< 11701 1727096118.91414: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 11701 1727096118.91847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
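Editor's annotation: the module result echoed in the chunks above ({"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}}) is consistent with a bare ansible.builtin.stat call against /run/ostree-booted using the module's defaults, with "exists": false indicating the managed node is not ostree-based. A minimal sketch of a task shaped like the one driving this run follows; the task name and register variable are illustrative assumptions, not taken from this log.

    # Hypothetical sketch, assuming a plain stat check; only "path" is set,
    # so the remaining module_args fall back to the stat module defaults
    # (follow=false, get_checksum=true, get_mime=true, get_attributes=true,
    # checksum_algorithm=sha1), matching the invocation seen above.
    - name: Check whether the managed node is ostree-based   # hypothetical task name
      ansible.builtin.stat:
        path: /run/ostree-booted                              # path from module_args above
      register: __ostree_booted_stat                          # hypothetical variable name

With "exists": false in the result, a later conditional such as `when: not __ostree_booted_stat.stat.exists` (variable name assumed as above) would treat this host as a non-ostree system.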
<<< 11701 1727096118.91877: stderr chunk (state=3): >>><<< 11701 1727096118.91893: stdout chunk (state=3): >>><<< 11701 1727096118.92118: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e954184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e953e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9541aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e951c9130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e951c9fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e95207e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e95207f50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9523f890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9523ff20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9521fb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9521d280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e95205040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9525f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9525e420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9521e150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9525cc80> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e95294890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e952042c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e95294d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e95294bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e95294fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e95202de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e952956d0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e952953a0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e952965d0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e952ac7a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e952adeb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f9e952aed50> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e952af380> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e952ae2a0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e952afe00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e952af530> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e95296570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e95037ce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e95060740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e950604a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e95060770> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e950610a0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e95061a60> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f9e95060950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e95035e80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e95062e10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e950618e0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e95296cc0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9508b170> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e950af4d0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e951102f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e95112a20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e951103e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e950d52e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94f153d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e950ae300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e95063d40> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f9e950ae660> # zipimport: found 30 names in '/tmp/ansible_stat_payload_re506wic/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94f6b0e0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94f49fd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94f49160> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94f69760> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e94f92ab0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94f92870> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94f92180> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94f92660> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9541a9c0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e94f93830> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e94f939e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94f93f20> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94915bb0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e94917860> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94918260> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94919130> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9491be90> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e94f4b0b0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9491a150> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94923d10> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e949227e0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94922540> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94922a50> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9491a660> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e9496bec0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9496bfb0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e9496dac0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9496d880> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e9496ffb0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9496e180> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94973710> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e949700e0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e949744d0> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e94974500> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e94974a10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9496c1a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e948001d0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e94801400> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94976960> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e94977d10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94976570> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e948055e0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94806480> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94975820> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94806330> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94807530> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9e948120c0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e9480f380> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e949029c0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e94fca690> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e948121b0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9e948016d0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing 
re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # 
destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # 
destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
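The "auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0'" and "mux_client_request_session" lines above show the ssh connection plugin reusing an existing OpenSSH ControlMaster socket instead of opening a fresh connection for each command. Ansible enables this multiplexing by default; a minimal sketch of how the same behaviour is commonly made explicit through an inventory or group variable (illustrative values, not taken from this run's configuration) is:

    # group_vars/all.yml -- hypothetical file, shown only to illustrate the SSH multiplexing seen in this log
    ansible_ssh_common_args: '-o ControlMaster=auto -o ControlPersist=60s -o ControlPath=~/.ansible/cp/%C'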
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] 
removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing 
ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # 
cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 11701 1727096118.93145: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096118.3307128-11832-91339172282415/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096118.93147: _low_level_execute_command(): starting 11701 1727096118.93217: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1727096118.3307128-11832-91339172282415/ > /dev/null 2>&1 && sleep 0' 11701 1727096118.94011: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096118.94152: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096118.94282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096118.96242: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096118.96247: stdout chunk (state=3): >>><<< 11701 1727096118.96249: stderr chunk (state=3): >>><<< 11701 1727096118.96494: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096118.96498: handler run complete 11701 1727096118.96501: attempt loop complete, returning result 11701 1727096118.96502: _execute() done 11701 1727096118.96504: dumping result to json 11701 1727096118.96506: done dumping result, returning 11701 1727096118.96507: done running TaskExecutor() for managed_node3/TASK: Check if system is ostree [0afff68d-5257-a05c-c957-0000000000df] 11701 1727096118.96509: sending task result for task 0afff68d-5257-a05c-c957-0000000000df 11701 1727096118.96580: done sending task result for task 0afff68d-5257-a05c-c957-0000000000df 11701 1727096118.96583: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 
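The records above show the "Check if system is ostree" task returning ok with "exists": false for /run/ostree-booted, which is how the test playbook detects rpm-ostree based systems. The task file itself is not reproduced in this log; a minimal sketch of what such a check typically looks like (module argument inferred from the invocation shown earlier, register name taken from the later set_fact records) is:

    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted
      register: __ostree_booted_stat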
11701 1727096118.96639: no more pending results, returning what we have 11701 1727096118.96642: results queue empty 11701 1727096118.96643: checking for any_errors_fatal 11701 1727096118.96648: done checking for any_errors_fatal 11701 1727096118.96651: checking for max_fail_percentage 11701 1727096118.96653: done checking for max_fail_percentage 11701 1727096118.96654: checking to see if all hosts have failed and the running result is not ok 11701 1727096118.96654: done checking to see if all hosts have failed 11701 1727096118.96655: getting the remaining hosts for this loop 11701 1727096118.96656: done getting the remaining hosts for this loop 11701 1727096118.96659: getting the next task for host managed_node3 11701 1727096118.96664: done getting next task for host managed_node3 11701 1727096118.96667: ^ task is: TASK: Set flag to indicate system is ostree 11701 1727096118.96671: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11701 1727096118.96674: getting variables 11701 1727096118.96675: in VariableManager get_vars() 11701 1727096118.96715: Calling all_inventory to load vars for managed_node3 11701 1727096118.96718: Calling groups_inventory to load vars for managed_node3 11701 1727096118.96721: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096118.96731: Calling all_plugins_play to load vars for managed_node3 11701 1727096118.96734: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096118.96737: Calling groups_plugins_play to load vars for managed_node3 11701 1727096118.97024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096118.97405: done with get_vars() 11701 1727096118.97419: done getting variables 11701 1727096118.97530: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Monday 23 September 2024 08:55:18 -0400 (0:00:00.761) 0:00:02.940 ****** 11701 1727096118.97561: entering _queue_task() for managed_node3/set_fact 11701 1727096118.97563: Creating lock for set_fact 11701 1727096118.98070: worker is 1 (out of 1 available) 11701 1727096118.98084: exiting _queue_task() for managed_node3/set_fact 11701 1727096118.98096: done queuing things up, now waiting for results queue to drain 11701 1727096118.98097: waiting for pending results... 
11701 1727096118.98473: running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree 11701 1727096118.98480: in run() - task 0afff68d-5257-a05c-c957-0000000000e0 11701 1727096118.98484: variable 'ansible_search_path' from source: unknown 11701 1727096118.98486: variable 'ansible_search_path' from source: unknown 11701 1727096118.98489: calling self._execute() 11701 1727096118.98606: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096118.98610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096118.98613: variable 'omit' from source: magic vars 11701 1727096118.99111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11701 1727096118.99340: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11701 1727096118.99372: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11701 1727096118.99404: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11701 1727096118.99437: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11701 1727096118.99518: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11701 1727096118.99582: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11701 1727096118.99586: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096118.99589: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11701 1727096118.99707: Evaluated conditional (not __network_is_ostree is defined): True 11701 1727096118.99713: variable 'omit' from source: magic vars 11701 1727096118.99754: variable 'omit' from source: magic vars 11701 1727096118.99911: variable '__ostree_booted_stat' from source: set_fact 11701 1727096118.99915: variable 'omit' from source: magic vars 11701 1727096118.99939: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096118.99966: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096118.99988: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096119.00024: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096119.00027: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096119.00038: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096119.00041: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096119.00043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096119.00197: Set connection var ansible_module_compression to ZIP_DEFLATED 
11701 1727096119.00200: Set connection var ansible_timeout to 10 11701 1727096119.00205: Set connection var ansible_shell_type to sh 11701 1727096119.00208: Set connection var ansible_shell_executable to /bin/sh 11701 1727096119.00210: Set connection var ansible_connection to ssh 11701 1727096119.00213: Set connection var ansible_pipelining to False 11701 1727096119.00215: variable 'ansible_shell_executable' from source: unknown 11701 1727096119.00217: variable 'ansible_connection' from source: unknown 11701 1727096119.00219: variable 'ansible_module_compression' from source: unknown 11701 1727096119.00221: variable 'ansible_shell_type' from source: unknown 11701 1727096119.00223: variable 'ansible_shell_executable' from source: unknown 11701 1727096119.00226: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096119.00228: variable 'ansible_pipelining' from source: unknown 11701 1727096119.00230: variable 'ansible_timeout' from source: unknown 11701 1727096119.00231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096119.00379: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096119.00382: variable 'omit' from source: magic vars 11701 1727096119.00384: starting attempt loop 11701 1727096119.00387: running the handler 11701 1727096119.00389: handler run complete 11701 1727096119.00391: attempt loop complete, returning result 11701 1727096119.00393: _execute() done 11701 1727096119.00395: dumping result to json 11701 1727096119.00397: done dumping result, returning 11701 1727096119.00399: done running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree [0afff68d-5257-a05c-c957-0000000000e0] 11701 1727096119.00402: sending task result for task 0afff68d-5257-a05c-c957-0000000000e0 ok: [managed_node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 11701 1727096119.00581: no more pending results, returning what we have 11701 1727096119.00586: results queue empty 11701 1727096119.00587: checking for any_errors_fatal 11701 1727096119.00592: done checking for any_errors_fatal 11701 1727096119.00593: checking for max_fail_percentage 11701 1727096119.00596: done checking for max_fail_percentage 11701 1727096119.00596: checking to see if all hosts have failed and the running result is not ok 11701 1727096119.00597: done checking to see if all hosts have failed 11701 1727096119.00598: getting the remaining hosts for this loop 11701 1727096119.00600: done getting the remaining hosts for this loop 11701 1727096119.00603: getting the next task for host managed_node3 11701 1727096119.00611: done getting next task for host managed_node3 11701 1727096119.00613: ^ task is: TASK: Fix CentOS6 Base repo 11701 1727096119.00616: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11701 1727096119.00625: getting variables 11701 1727096119.00628: in VariableManager get_vars() 11701 1727096119.00659: Calling all_inventory to load vars for managed_node3 11701 1727096119.00661: Calling groups_inventory to load vars for managed_node3 11701 1727096119.00664: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096119.00676: Calling all_plugins_play to load vars for managed_node3 11701 1727096119.00678: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096119.00680: Calling groups_plugins_play to load vars for managed_node3 11701 1727096119.00959: done sending task result for task 0afff68d-5257-a05c-c957-0000000000e0 11701 1727096119.00969: WORKER PROCESS EXITING 11701 1727096119.00990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096119.01237: done with get_vars() 11701 1727096119.01253: done getting variables 11701 1727096119.01627: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Monday 23 September 2024 08:55:19 -0400 (0:00:00.040) 0:00:02.981 ****** 11701 1727096119.01656: entering _queue_task() for managed_node3/copy 11701 1727096119.01921: worker is 1 (out of 1 available) 11701 1727096119.01933: exiting _queue_task() for managed_node3/copy 11701 1727096119.01944: done queuing things up, now waiting for results queue to drain 11701 1727096119.01946: waiting for pending results... 
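The "Set flag to indicate system is ostree" task above ran only because its conditional (not __network_is_ostree is defined) evaluated to True, and it recorded the fact __network_is_ostree: false from the earlier stat result. A minimal sketch of such a task (the exact Jinja2 expression is an assumption; only the condition, the fact name, and the resulting value are visible in this log) is:

    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined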
11701 1727096119.02096: running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo 11701 1727096119.02181: in run() - task 0afff68d-5257-a05c-c957-0000000000e2 11701 1727096119.02195: variable 'ansible_search_path' from source: unknown 11701 1727096119.02198: variable 'ansible_search_path' from source: unknown 11701 1727096119.02226: calling self._execute() 11701 1727096119.02287: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096119.02293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096119.02303: variable 'omit' from source: magic vars 11701 1727096119.02716: variable 'ansible_distribution' from source: facts 11701 1727096119.02735: Evaluated conditional (ansible_distribution == 'CentOS'): True 11701 1727096119.02828: variable 'ansible_distribution_major_version' from source: facts 11701 1727096119.02833: Evaluated conditional (ansible_distribution_major_version == '6'): False 11701 1727096119.02837: when evaluation is False, skipping this task 11701 1727096119.02839: _execute() done 11701 1727096119.02843: dumping result to json 11701 1727096119.02845: done dumping result, returning 11701 1727096119.02913: done running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo [0afff68d-5257-a05c-c957-0000000000e2] 11701 1727096119.02916: sending task result for task 0afff68d-5257-a05c-c957-0000000000e2 11701 1727096119.02998: done sending task result for task 0afff68d-5257-a05c-c957-0000000000e2 11701 1727096119.03001: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 11701 1727096119.03100: no more pending results, returning what we have 11701 1727096119.03103: results queue empty 11701 1727096119.03104: checking for any_errors_fatal 11701 1727096119.03108: done checking for any_errors_fatal 11701 1727096119.03108: checking for max_fail_percentage 11701 1727096119.03110: done checking for max_fail_percentage 11701 1727096119.03110: checking to see if all hosts have failed and the running result is not ok 11701 1727096119.03111: done checking to see if all hosts have failed 11701 1727096119.03112: getting the remaining hosts for this loop 11701 1727096119.03113: done getting the remaining hosts for this loop 11701 1727096119.03116: getting the next task for host managed_node3 11701 1727096119.03122: done getting next task for host managed_node3 11701 1727096119.03124: ^ task is: TASK: Include the task 'enable_epel.yml' 11701 1727096119.03127: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096119.03130: getting variables 11701 1727096119.03131: in VariableManager get_vars() 11701 1727096119.03160: Calling all_inventory to load vars for managed_node3 11701 1727096119.03162: Calling groups_inventory to load vars for managed_node3 11701 1727096119.03165: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096119.03179: Calling all_plugins_play to load vars for managed_node3 11701 1727096119.03182: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096119.03187: Calling groups_plugins_play to load vars for managed_node3 11701 1727096119.03336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096119.03529: done with get_vars() 11701 1727096119.03541: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Monday 23 September 2024 08:55:19 -0400 (0:00:00.019) 0:00:03.000 ****** 11701 1727096119.03625: entering _queue_task() for managed_node3/include_tasks 11701 1727096119.03963: worker is 1 (out of 1 available) 11701 1727096119.04122: exiting _queue_task() for managed_node3/include_tasks 11701 1727096119.04135: done queuing things up, now waiting for results queue to drain 11701 1727096119.04136: waiting for pending results... 11701 1727096119.04262: running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' 11701 1727096119.04373: in run() - task 0afff68d-5257-a05c-c957-0000000000e3 11701 1727096119.04377: variable 'ansible_search_path' from source: unknown 11701 1727096119.04381: variable 'ansible_search_path' from source: unknown 11701 1727096119.04417: calling self._execute() 11701 1727096119.04546: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096119.04561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096119.04674: variable 'omit' from source: magic vars 11701 1727096119.05728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11701 1727096119.07441: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11701 1727096119.07491: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11701 1727096119.07520: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11701 1727096119.07545: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11701 1727096119.07570: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11701 1727096119.07657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096119.07733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096119.07736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 11701 1727096119.07739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096119.07773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096119.07911: variable '__network_is_ostree' from source: set_fact 11701 1727096119.07934: Evaluated conditional (not __network_is_ostree | d(false)): True 11701 1727096119.07946: _execute() done 11701 1727096119.07957: dumping result to json 11701 1727096119.07965: done dumping result, returning 11701 1727096119.07983: done running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' [0afff68d-5257-a05c-c957-0000000000e3] 11701 1727096119.07994: sending task result for task 0afff68d-5257-a05c-c957-0000000000e3 11701 1727096119.08159: no more pending results, returning what we have 11701 1727096119.08165: in VariableManager get_vars() 11701 1727096119.08201: Calling all_inventory to load vars for managed_node3 11701 1727096119.08203: Calling groups_inventory to load vars for managed_node3 11701 1727096119.08207: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096119.08218: Calling all_plugins_play to load vars for managed_node3 11701 1727096119.08221: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096119.08223: Calling groups_plugins_play to load vars for managed_node3 11701 1727096119.08445: done sending task result for task 0afff68d-5257-a05c-c957-0000000000e3 11701 1727096119.08451: WORKER PROCESS EXITING 11701 1727096119.08472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096119.08657: done with get_vars() 11701 1727096119.08668: variable 'ansible_search_path' from source: unknown 11701 1727096119.08670: variable 'ansible_search_path' from source: unknown 11701 1727096119.08706: we have included files to process 11701 1727096119.08707: generating all_blocks data 11701 1727096119.08710: done generating all_blocks data 11701 1727096119.08715: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 11701 1727096119.08716: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 11701 1727096119.08718: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 11701 1727096119.09339: done processing included file 11701 1727096119.09341: iterating over new_blocks loaded from include file 11701 1727096119.09342: in VariableManager get_vars() 11701 1727096119.09353: done with get_vars() 11701 1727096119.09354: filtering new block on tags 11701 1727096119.09374: done filtering new block on tags 11701 1727096119.09376: in VariableManager get_vars() 11701 1727096119.09384: done with get_vars() 11701 1727096119.09385: filtering new block on tags 11701 1727096119.09391: done filtering new block on tags 11701 1727096119.09392: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node3 11701 1727096119.09397: extending task lists for all hosts with included blocks 
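Two control-flow decisions are visible in the records above: "Fix CentOS6 Base repo" was skipped because ansible_distribution_major_version == '6' evaluated to False, and "Include the task 'enable_epel.yml'" ran because not __network_is_ostree | d(false) evaluated to True, pulling enable_epel.yml into the task list. A minimal sketch of this gating pattern (task bodies are placeholders; only the task names and when conditions are taken from the log) is:

    - name: Fix CentOS6 Base repo
      ansible.builtin.copy:
        dest: /etc/yum.repos.d/CentOS-Base.repo   # hypothetical destination, not shown in this log
        content: "..."                            # placeholder content
      when:
        - ansible_distribution == 'CentOS'
        - ansible_distribution_major_version == '6'

    - name: Include the task 'enable_epel.yml'
      ansible.builtin.include_tasks: enable_epel.yml
      when: not __network_is_ostree | d(false)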
11701 1727096119.09464: done extending task lists 11701 1727096119.09465: done processing included files 11701 1727096119.09466: results queue empty 11701 1727096119.09467: checking for any_errors_fatal 11701 1727096119.09472: done checking for any_errors_fatal 11701 1727096119.09473: checking for max_fail_percentage 11701 1727096119.09474: done checking for max_fail_percentage 11701 1727096119.09474: checking to see if all hosts have failed and the running result is not ok 11701 1727096119.09475: done checking to see if all hosts have failed 11701 1727096119.09475: getting the remaining hosts for this loop 11701 1727096119.09476: done getting the remaining hosts for this loop 11701 1727096119.09478: getting the next task for host managed_node3 11701 1727096119.09481: done getting next task for host managed_node3 11701 1727096119.09483: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 11701 1727096119.09485: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096119.09486: getting variables 11701 1727096119.09487: in VariableManager get_vars() 11701 1727096119.09493: Calling all_inventory to load vars for managed_node3 11701 1727096119.09494: Calling groups_inventory to load vars for managed_node3 11701 1727096119.09496: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096119.09500: Calling all_plugins_play to load vars for managed_node3 11701 1727096119.09506: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096119.09508: Calling groups_plugins_play to load vars for managed_node3 11701 1727096119.09609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096119.09720: done with get_vars() 11701 1727096119.09726: done getting variables 11701 1727096119.09779: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 11701 1727096119.09926: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Monday 23 September 2024 08:55:19 -0400 (0:00:00.063) 0:00:03.064 ****** 11701 1727096119.09962: entering _queue_task() for managed_node3/command 11701 1727096119.09963: Creating lock for command 11701 1727096119.10215: worker is 1 (out of 1 available) 11701 1727096119.10227: exiting _queue_task() for managed_node3/command 11701 1727096119.10237: done queuing things up, now waiting for results queue to drain 11701 1727096119.10239: waiting for pending results... 
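The next entries evaluate the "Create EPEL {{ ansible_distribution_major_version }}" task, a command action guarded by two conditionals that appear in the evaluation messages below. The actual command it runs is not captured in this log, so the sketch below uses a placeholder body; only the guard structure is reconstructed from the log:

    - name: Create EPEL {{ ansible_distribution_major_version }}
      # placeholder body - the real command is not shown in this log
      ansible.builtin.command: echo "create EPEL repo"
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']

On this host the second condition evaluates to False, so the task is skipped, as the "skipping" result below shows.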
11701 1727096119.10394: running TaskExecutor() for managed_node3/TASK: Create EPEL 10 11701 1727096119.10458: in run() - task 0afff68d-5257-a05c-c957-0000000000fd 11701 1727096119.10473: variable 'ansible_search_path' from source: unknown 11701 1727096119.10476: variable 'ansible_search_path' from source: unknown 11701 1727096119.10506: calling self._execute() 11701 1727096119.10562: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096119.10566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096119.10579: variable 'omit' from source: magic vars 11701 1727096119.10857: variable 'ansible_distribution' from source: facts 11701 1727096119.10866: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11701 1727096119.10960: variable 'ansible_distribution_major_version' from source: facts 11701 1727096119.10964: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11701 1727096119.10967: when evaluation is False, skipping this task 11701 1727096119.10971: _execute() done 11701 1727096119.10976: dumping result to json 11701 1727096119.10978: done dumping result, returning 11701 1727096119.10986: done running TaskExecutor() for managed_node3/TASK: Create EPEL 10 [0afff68d-5257-a05c-c957-0000000000fd] 11701 1727096119.10988: sending task result for task 0afff68d-5257-a05c-c957-0000000000fd 11701 1727096119.11091: done sending task result for task 0afff68d-5257-a05c-c957-0000000000fd 11701 1727096119.11093: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11701 1727096119.11170: no more pending results, returning what we have 11701 1727096119.11174: results queue empty 11701 1727096119.11175: checking for any_errors_fatal 11701 1727096119.11176: done checking for any_errors_fatal 11701 1727096119.11177: checking for max_fail_percentage 11701 1727096119.11178: done checking for max_fail_percentage 11701 1727096119.11179: checking to see if all hosts have failed and the running result is not ok 11701 1727096119.11180: done checking to see if all hosts have failed 11701 1727096119.11181: getting the remaining hosts for this loop 11701 1727096119.11182: done getting the remaining hosts for this loop 11701 1727096119.11186: getting the next task for host managed_node3 11701 1727096119.11191: done getting next task for host managed_node3 11701 1727096119.11194: ^ task is: TASK: Install yum-utils package 11701 1727096119.11197: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096119.11200: getting variables 11701 1727096119.11203: in VariableManager get_vars() 11701 1727096119.11231: Calling all_inventory to load vars for managed_node3 11701 1727096119.11233: Calling groups_inventory to load vars for managed_node3 11701 1727096119.11236: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096119.11246: Calling all_plugins_play to load vars for managed_node3 11701 1727096119.11248: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096119.11253: Calling groups_plugins_play to load vars for managed_node3 11701 1727096119.11378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096119.11495: done with get_vars() 11701 1727096119.11503: done getting variables 11701 1727096119.11579: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Monday 23 September 2024 08:55:19 -0400 (0:00:00.016) 0:00:03.080 ****** 11701 1727096119.11600: entering _queue_task() for managed_node3/package 11701 1727096119.11601: Creating lock for package 11701 1727096119.11825: worker is 1 (out of 1 available) 11701 1727096119.11838: exiting _queue_task() for managed_node3/package 11701 1727096119.11848: done queuing things up, now waiting for results queue to drain 11701 1727096119.11850: waiting for pending results... 
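The "Install yum-utils package" task uses the package action and the same distribution/version guard, so it is skipped on this host as well. A minimal sketch, assuming the package name matches the task name:

    - name: Install yum-utils package
      ansible.builtin.package:
        name: yum-utils
        state: present
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']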
11701 1727096119.12006: running TaskExecutor() for managed_node3/TASK: Install yum-utils package 11701 1727096119.12076: in run() - task 0afff68d-5257-a05c-c957-0000000000fe 11701 1727096119.12090: variable 'ansible_search_path' from source: unknown 11701 1727096119.12093: variable 'ansible_search_path' from source: unknown 11701 1727096119.12119: calling self._execute() 11701 1727096119.12236: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096119.12240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096119.12249: variable 'omit' from source: magic vars 11701 1727096119.12510: variable 'ansible_distribution' from source: facts 11701 1727096119.12523: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11701 1727096119.12612: variable 'ansible_distribution_major_version' from source: facts 11701 1727096119.12619: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11701 1727096119.12622: when evaluation is False, skipping this task 11701 1727096119.12626: _execute() done 11701 1727096119.12631: dumping result to json 11701 1727096119.12634: done dumping result, returning 11701 1727096119.12640: done running TaskExecutor() for managed_node3/TASK: Install yum-utils package [0afff68d-5257-a05c-c957-0000000000fe] 11701 1727096119.12646: sending task result for task 0afff68d-5257-a05c-c957-0000000000fe 11701 1727096119.12734: done sending task result for task 0afff68d-5257-a05c-c957-0000000000fe 11701 1727096119.12738: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11701 1727096119.12838: no more pending results, returning what we have 11701 1727096119.12841: results queue empty 11701 1727096119.12842: checking for any_errors_fatal 11701 1727096119.12846: done checking for any_errors_fatal 11701 1727096119.12846: checking for max_fail_percentage 11701 1727096119.12848: done checking for max_fail_percentage 11701 1727096119.12851: checking to see if all hosts have failed and the running result is not ok 11701 1727096119.12852: done checking to see if all hosts have failed 11701 1727096119.12853: getting the remaining hosts for this loop 11701 1727096119.12854: done getting the remaining hosts for this loop 11701 1727096119.12857: getting the next task for host managed_node3 11701 1727096119.12862: done getting next task for host managed_node3 11701 1727096119.12864: ^ task is: TASK: Enable EPEL 7 11701 1727096119.12869: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096119.12872: getting variables 11701 1727096119.12873: in VariableManager get_vars() 11701 1727096119.12894: Calling all_inventory to load vars for managed_node3 11701 1727096119.12896: Calling groups_inventory to load vars for managed_node3 11701 1727096119.12899: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096119.12908: Calling all_plugins_play to load vars for managed_node3 11701 1727096119.12911: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096119.12913: Calling groups_plugins_play to load vars for managed_node3 11701 1727096119.13024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096119.13143: done with get_vars() 11701 1727096119.13151: done getting variables 11701 1727096119.13195: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Monday 23 September 2024 08:55:19 -0400 (0:00:00.016) 0:00:03.096 ****** 11701 1727096119.13216: entering _queue_task() for managed_node3/command 11701 1727096119.13434: worker is 1 (out of 1 available) 11701 1727096119.13446: exiting _queue_task() for managed_node3/command 11701 1727096119.13457: done queuing things up, now waiting for results queue to drain 11701 1727096119.13458: waiting for pending results... 11701 1727096119.13625: running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 11701 1727096119.13696: in run() - task 0afff68d-5257-a05c-c957-0000000000ff 11701 1727096119.13712: variable 'ansible_search_path' from source: unknown 11701 1727096119.13716: variable 'ansible_search_path' from source: unknown 11701 1727096119.13740: calling self._execute() 11701 1727096119.13802: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096119.13805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096119.13819: variable 'omit' from source: magic vars 11701 1727096119.14097: variable 'ansible_distribution' from source: facts 11701 1727096119.14108: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11701 1727096119.14203: variable 'ansible_distribution_major_version' from source: facts 11701 1727096119.14206: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11701 1727096119.14209: when evaluation is False, skipping this task 11701 1727096119.14212: _execute() done 11701 1727096119.14215: dumping result to json 11701 1727096119.14218: done dumping result, returning 11701 1727096119.14228: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 [0afff68d-5257-a05c-c957-0000000000ff] 11701 1727096119.14230: sending task result for task 0afff68d-5257-a05c-c957-0000000000ff 11701 1727096119.14317: done sending task result for task 0afff68d-5257-a05c-c957-0000000000ff 11701 1727096119.14320: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11701 1727096119.14389: no more pending results, returning what we 
have 11701 1727096119.14393: results queue empty 11701 1727096119.14394: checking for any_errors_fatal 11701 1727096119.14399: done checking for any_errors_fatal 11701 1727096119.14400: checking for max_fail_percentage 11701 1727096119.14401: done checking for max_fail_percentage 11701 1727096119.14402: checking to see if all hosts have failed and the running result is not ok 11701 1727096119.14403: done checking to see if all hosts have failed 11701 1727096119.14404: getting the remaining hosts for this loop 11701 1727096119.14405: done getting the remaining hosts for this loop 11701 1727096119.14409: getting the next task for host managed_node3 11701 1727096119.14415: done getting next task for host managed_node3 11701 1727096119.14417: ^ task is: TASK: Enable EPEL 8 11701 1727096119.14420: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11701 1727096119.14423: getting variables 11701 1727096119.14425: in VariableManager get_vars() 11701 1727096119.14453: Calling all_inventory to load vars for managed_node3 11701 1727096119.14455: Calling groups_inventory to load vars for managed_node3 11701 1727096119.14458: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096119.14470: Calling all_plugins_play to load vars for managed_node3 11701 1727096119.14472: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096119.14475: Calling groups_plugins_play to load vars for managed_node3 11701 1727096119.14602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096119.14738: done with get_vars() 11701 1727096119.14747: done getting variables 11701 1727096119.14792: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Monday 23 September 2024 08:55:19 -0400 (0:00:00.015) 0:00:03.112 ****** 11701 1727096119.14812: entering _queue_task() for managed_node3/command 11701 1727096119.15029: worker is 1 (out of 1 available) 11701 1727096119.15041: exiting _queue_task() for managed_node3/command 11701 1727096119.15054: done queuing things up, now waiting for results queue to drain 11701 1727096119.15056: waiting for pending results... 
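The "Enable EPEL 7" task above was skipped on the same guard, and the "Enable EPEL 8" and "Enable EPEL 6" tasks that follow are skipped in the same way (the EPEL 6 variant is a copy action guarded by ansible_distribution_major_version == '6'). All of these skips come down to the distribution facts on the managed node. An illustrative task, not part of the test, that would show the facts driving the conditionals:

    - name: Show the facts behind the EPEL conditionals (illustrative only)
      ansible.builtin.debug:
        msg: "{{ ansible_distribution }} {{ ansible_distribution_major_version }}"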
11701 1727096119.15207: running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 11701 1727096119.15272: in run() - task 0afff68d-5257-a05c-c957-000000000100 11701 1727096119.15284: variable 'ansible_search_path' from source: unknown 11701 1727096119.15289: variable 'ansible_search_path' from source: unknown 11701 1727096119.15318: calling self._execute() 11701 1727096119.15372: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096119.15378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096119.15388: variable 'omit' from source: magic vars 11701 1727096119.15663: variable 'ansible_distribution' from source: facts 11701 1727096119.15676: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11701 1727096119.15763: variable 'ansible_distribution_major_version' from source: facts 11701 1727096119.15767: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11701 1727096119.15771: when evaluation is False, skipping this task 11701 1727096119.15774: _execute() done 11701 1727096119.15778: dumping result to json 11701 1727096119.15781: done dumping result, returning 11701 1727096119.15787: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 [0afff68d-5257-a05c-c957-000000000100] 11701 1727096119.15792: sending task result for task 0afff68d-5257-a05c-c957-000000000100 11701 1727096119.15877: done sending task result for task 0afff68d-5257-a05c-c957-000000000100 11701 1727096119.15879: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11701 1727096119.15922: no more pending results, returning what we have 11701 1727096119.15926: results queue empty 11701 1727096119.15927: checking for any_errors_fatal 11701 1727096119.15933: done checking for any_errors_fatal 11701 1727096119.15934: checking for max_fail_percentage 11701 1727096119.15936: done checking for max_fail_percentage 11701 1727096119.15937: checking to see if all hosts have failed and the running result is not ok 11701 1727096119.15938: done checking to see if all hosts have failed 11701 1727096119.15938: getting the remaining hosts for this loop 11701 1727096119.15939: done getting the remaining hosts for this loop 11701 1727096119.15942: getting the next task for host managed_node3 11701 1727096119.15953: done getting next task for host managed_node3 11701 1727096119.15955: ^ task is: TASK: Enable EPEL 6 11701 1727096119.15959: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096119.15963: getting variables 11701 1727096119.15964: in VariableManager get_vars() 11701 1727096119.16000: Calling all_inventory to load vars for managed_node3 11701 1727096119.16003: Calling groups_inventory to load vars for managed_node3 11701 1727096119.16006: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096119.16015: Calling all_plugins_play to load vars for managed_node3 11701 1727096119.16018: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096119.16020: Calling groups_plugins_play to load vars for managed_node3 11701 1727096119.16149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096119.16266: done with get_vars() 11701 1727096119.16276: done getting variables 11701 1727096119.16320: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Monday 23 September 2024 08:55:19 -0400 (0:00:00.015) 0:00:03.128 ****** 11701 1727096119.16340: entering _queue_task() for managed_node3/copy 11701 1727096119.16548: worker is 1 (out of 1 available) 11701 1727096119.16561: exiting _queue_task() for managed_node3/copy 11701 1727096119.16574: done queuing things up, now waiting for results queue to drain 11701 1727096119.16576: waiting for pending results... 11701 1727096119.16731: running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 11701 1727096119.16800: in run() - task 0afff68d-5257-a05c-c957-000000000102 11701 1727096119.16817: variable 'ansible_search_path' from source: unknown 11701 1727096119.16820: variable 'ansible_search_path' from source: unknown 11701 1727096119.16842: calling self._execute() 11701 1727096119.16900: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096119.16904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096119.16915: variable 'omit' from source: magic vars 11701 1727096119.17193: variable 'ansible_distribution' from source: facts 11701 1727096119.17202: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11701 1727096119.17285: variable 'ansible_distribution_major_version' from source: facts 11701 1727096119.17288: Evaluated conditional (ansible_distribution_major_version == '6'): False 11701 1727096119.17291: when evaluation is False, skipping this task 11701 1727096119.17294: _execute() done 11701 1727096119.17297: dumping result to json 11701 1727096119.17300: done dumping result, returning 11701 1727096119.17307: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 [0afff68d-5257-a05c-c957-000000000102] 11701 1727096119.17311: sending task result for task 0afff68d-5257-a05c-c957-000000000102 11701 1727096119.17399: done sending task result for task 0afff68d-5257-a05c-c957-000000000102 11701 1727096119.17402: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 11701 1727096119.17445: no more pending results, returning what we have 11701 
1727096119.17448: results queue empty 11701 1727096119.17449: checking for any_errors_fatal 11701 1727096119.17454: done checking for any_errors_fatal 11701 1727096119.17455: checking for max_fail_percentage 11701 1727096119.17456: done checking for max_fail_percentage 11701 1727096119.17457: checking to see if all hosts have failed and the running result is not ok 11701 1727096119.17458: done checking to see if all hosts have failed 11701 1727096119.17459: getting the remaining hosts for this loop 11701 1727096119.17460: done getting the remaining hosts for this loop 11701 1727096119.17463: getting the next task for host managed_node3 11701 1727096119.17473: done getting next task for host managed_node3 11701 1727096119.17476: ^ task is: TASK: Set network provider to 'nm' 11701 1727096119.17478: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11701 1727096119.17482: getting variables 11701 1727096119.17484: in VariableManager get_vars() 11701 1727096119.17514: Calling all_inventory to load vars for managed_node3 11701 1727096119.17516: Calling groups_inventory to load vars for managed_node3 11701 1727096119.17521: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096119.17531: Calling all_plugins_play to load vars for managed_node3 11701 1727096119.17533: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096119.17536: Calling groups_plugins_play to load vars for managed_node3 11701 1727096119.17846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096119.17958: done with get_vars() 11701 1727096119.17966: done getting variables 11701 1727096119.18010: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:13 Monday 23 September 2024 08:55:19 -0400 (0:00:00.016) 0:00:03.144 ****** 11701 1727096119.18030: entering _queue_task() for managed_node3/set_fact 11701 1727096119.18261: worker is 1 (out of 1 available) 11701 1727096119.18276: exiting _queue_task() for managed_node3/set_fact 11701 1727096119.18285: done queuing things up, now waiting for results queue to drain 11701 1727096119.18286: waiting for pending results... 
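With the EPEL include finished, the play moves on to the "Set network provider to 'nm'" task from tests_bond_nm.yml. Its result below ("ansible_facts": {"network_provider": "nm"}) implies a set_fact of roughly the following form; this is a sketch reconstructed from the result, not a verbatim copy of the test file:

    - name: Set network provider to 'nm'
      ansible.builtin.set_fact:
        network_provider: nm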
11701 1727096119.18441: running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' 11701 1727096119.18503: in run() - task 0afff68d-5257-a05c-c957-000000000007 11701 1727096119.18517: variable 'ansible_search_path' from source: unknown 11701 1727096119.18559: calling self._execute() 11701 1727096119.18618: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096119.18626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096119.18638: variable 'omit' from source: magic vars 11701 1727096119.18714: variable 'omit' from source: magic vars 11701 1727096119.18746: variable 'omit' from source: magic vars 11701 1727096119.18784: variable 'omit' from source: magic vars 11701 1727096119.18820: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096119.18852: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096119.18872: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096119.18885: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096119.18895: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096119.18919: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096119.18922: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096119.18925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096119.19003: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096119.19006: Set connection var ansible_timeout to 10 11701 1727096119.19008: Set connection var ansible_shell_type to sh 11701 1727096119.19014: Set connection var ansible_shell_executable to /bin/sh 11701 1727096119.19017: Set connection var ansible_connection to ssh 11701 1727096119.19024: Set connection var ansible_pipelining to False 11701 1727096119.19041: variable 'ansible_shell_executable' from source: unknown 11701 1727096119.19044: variable 'ansible_connection' from source: unknown 11701 1727096119.19046: variable 'ansible_module_compression' from source: unknown 11701 1727096119.19049: variable 'ansible_shell_type' from source: unknown 11701 1727096119.19051: variable 'ansible_shell_executable' from source: unknown 11701 1727096119.19056: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096119.19061: variable 'ansible_pipelining' from source: unknown 11701 1727096119.19065: variable 'ansible_timeout' from source: unknown 11701 1727096119.19069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096119.19173: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096119.19192: variable 'omit' from source: magic vars 11701 1727096119.19195: starting attempt loop 11701 1727096119.19198: running the handler 11701 1727096119.19201: handler run complete 11701 1727096119.19208: attempt loop complete, returning result 11701 1727096119.19210: _execute() done 11701 1727096119.19213: 
dumping result to json 11701 1727096119.19215: done dumping result, returning 11701 1727096119.19222: done running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' [0afff68d-5257-a05c-c957-000000000007] 11701 1727096119.19228: sending task result for task 0afff68d-5257-a05c-c957-000000000007 11701 1727096119.19309: done sending task result for task 0afff68d-5257-a05c-c957-000000000007 11701 1727096119.19312: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 11701 1727096119.19370: no more pending results, returning what we have 11701 1727096119.19373: results queue empty 11701 1727096119.19374: checking for any_errors_fatal 11701 1727096119.19381: done checking for any_errors_fatal 11701 1727096119.19382: checking for max_fail_percentage 11701 1727096119.19384: done checking for max_fail_percentage 11701 1727096119.19384: checking to see if all hosts have failed and the running result is not ok 11701 1727096119.19385: done checking to see if all hosts have failed 11701 1727096119.19386: getting the remaining hosts for this loop 11701 1727096119.19387: done getting the remaining hosts for this loop 11701 1727096119.19390: getting the next task for host managed_node3 11701 1727096119.19397: done getting next task for host managed_node3 11701 1727096119.19399: ^ task is: TASK: meta (flush_handlers) 11701 1727096119.19400: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11701 1727096119.19404: getting variables 11701 1727096119.19407: in VariableManager get_vars() 11701 1727096119.19439: Calling all_inventory to load vars for managed_node3 11701 1727096119.19441: Calling groups_inventory to load vars for managed_node3 11701 1727096119.19445: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096119.19455: Calling all_plugins_play to load vars for managed_node3 11701 1727096119.19458: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096119.19460: Calling groups_plugins_play to load vars for managed_node3 11701 1727096119.19601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096119.19716: done with get_vars() 11701 1727096119.19724: done getting variables 11701 1727096119.19780: in VariableManager get_vars() 11701 1727096119.19788: Calling all_inventory to load vars for managed_node3 11701 1727096119.19789: Calling groups_inventory to load vars for managed_node3 11701 1727096119.19791: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096119.19794: Calling all_plugins_play to load vars for managed_node3 11701 1727096119.19795: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096119.19797: Calling groups_plugins_play to load vars for managed_node3 11701 1727096119.19881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096119.20012: done with get_vars() 11701 1727096119.20022: done queuing things up, now waiting for results queue to drain 11701 1727096119.20024: results queue empty 11701 1727096119.20024: checking for any_errors_fatal 11701 1727096119.20026: done checking for any_errors_fatal 11701 1727096119.20026: checking for 
max_fail_percentage 11701 1727096119.20027: done checking for max_fail_percentage 11701 1727096119.20027: checking to see if all hosts have failed and the running result is not ok 11701 1727096119.20028: done checking to see if all hosts have failed 11701 1727096119.20028: getting the remaining hosts for this loop 11701 1727096119.20029: done getting the remaining hosts for this loop 11701 1727096119.20031: getting the next task for host managed_node3 11701 1727096119.20034: done getting next task for host managed_node3 11701 1727096119.20035: ^ task is: TASK: meta (flush_handlers) 11701 1727096119.20036: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11701 1727096119.20042: getting variables 11701 1727096119.20043: in VariableManager get_vars() 11701 1727096119.20048: Calling all_inventory to load vars for managed_node3 11701 1727096119.20052: Calling groups_inventory to load vars for managed_node3 11701 1727096119.20053: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096119.20057: Calling all_plugins_play to load vars for managed_node3 11701 1727096119.20058: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096119.20060: Calling groups_plugins_play to load vars for managed_node3 11701 1727096119.20143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096119.20251: done with get_vars() 11701 1727096119.20257: done getting variables 11701 1727096119.20292: in VariableManager get_vars() 11701 1727096119.20298: Calling all_inventory to load vars for managed_node3 11701 1727096119.20299: Calling groups_inventory to load vars for managed_node3 11701 1727096119.20301: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096119.20305: Calling all_plugins_play to load vars for managed_node3 11701 1727096119.20307: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096119.20310: Calling groups_plugins_play to load vars for managed_node3 11701 1727096119.20390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096119.20513: done with get_vars() 11701 1727096119.20522: done queuing things up, now waiting for results queue to drain 11701 1727096119.20523: results queue empty 11701 1727096119.20524: checking for any_errors_fatal 11701 1727096119.20525: done checking for any_errors_fatal 11701 1727096119.20526: checking for max_fail_percentage 11701 1727096119.20527: done checking for max_fail_percentage 11701 1727096119.20528: checking to see if all hosts have failed and the running result is not ok 11701 1727096119.20528: done checking to see if all hosts have failed 11701 1727096119.20529: getting the remaining hosts for this loop 11701 1727096119.20529: done getting the remaining hosts for this loop 11701 1727096119.20531: getting the next task for host managed_node3 11701 1727096119.20533: done getting next task for host managed_node3 11701 1727096119.20534: ^ task is: None 11701 1727096119.20535: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11701 1727096119.20536: done queuing things up, now waiting for results queue to drain 11701 1727096119.20536: results queue empty 11701 1727096119.20536: checking for any_errors_fatal 11701 1727096119.20537: done checking for any_errors_fatal 11701 1727096119.20537: checking for max_fail_percentage 11701 1727096119.20538: done checking for max_fail_percentage 11701 1727096119.20538: checking to see if all hosts have failed and the running result is not ok 11701 1727096119.20539: done checking to see if all hosts have failed 11701 1727096119.20540: getting the next task for host managed_node3 11701 1727096119.20541: done getting next task for host managed_node3 11701 1727096119.20542: ^ task is: None 11701 1727096119.20542: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11701 1727096119.20587: in VariableManager get_vars() 11701 1727096119.20604: done with get_vars() 11701 1727096119.20608: in VariableManager get_vars() 11701 1727096119.20616: done with get_vars() 11701 1727096119.20619: variable 'omit' from source: magic vars 11701 1727096119.20645: in VariableManager get_vars() 11701 1727096119.20658: done with get_vars() 11701 1727096119.20675: variable 'omit' from source: magic vars PLAY [Play for testing bond connection] **************************************** 11701 1727096119.21144: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 11701 1727096119.21177: getting the remaining hosts for this loop 11701 1727096119.21178: done getting the remaining hosts for this loop 11701 1727096119.21180: getting the next task for host managed_node3 11701 1727096119.21183: done getting next task for host managed_node3 11701 1727096119.21185: ^ task is: TASK: Gathering Facts 11701 1727096119.21186: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096119.21188: getting variables 11701 1727096119.21189: in VariableManager get_vars() 11701 1727096119.21202: Calling all_inventory to load vars for managed_node3 11701 1727096119.21205: Calling groups_inventory to load vars for managed_node3 11701 1727096119.21207: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096119.21212: Calling all_plugins_play to load vars for managed_node3 11701 1727096119.21225: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096119.21228: Calling groups_plugins_play to load vars for managed_node3 11701 1727096119.21366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096119.21545: done with get_vars() 11701 1727096119.21556: done getting variables 11701 1727096119.21603: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:3 Monday 23 September 2024 08:55:19 -0400 (0:00:00.035) 0:00:03.180 ****** 11701 1727096119.21626: entering _queue_task() for managed_node3/gather_facts 11701 1727096119.21932: worker is 1 (out of 1 available) 11701 1727096119.21945: exiting _queue_task() for managed_node3/gather_facts 11701 1727096119.21960: done queuing things up, now waiting for results queue to drain 11701 1727096119.21962: waiting for pending results... 
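Fact gathering for "Play for testing bond connection" is handled by the gather_facts action, which on this host boils down to running the setup module remotely: the entries that follow show an SSH probe of the remote home directory, creation of a remote temp directory, transfer of the AnsiballZ-packaged setup module over SFTP, and its execution with the remote Python interpreter. A rough explicit equivalent of this implicit gathering step would be:

    - name: Gathering Facts
      ansible.builtin.setup:

This is a simplification; the gather_facts action can dispatch to other fact modules, but the cached ansible.modules.setup payload referenced below is what is actually transferred here.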
11701 1727096119.22236: running TaskExecutor() for managed_node3/TASK: Gathering Facts 11701 1727096119.22302: in run() - task 0afff68d-5257-a05c-c957-000000000128 11701 1727096119.22315: variable 'ansible_search_path' from source: unknown 11701 1727096119.22351: calling self._execute() 11701 1727096119.22422: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096119.22428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096119.22436: variable 'omit' from source: magic vars 11701 1727096119.22726: variable 'ansible_distribution_major_version' from source: facts 11701 1727096119.22736: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096119.22742: variable 'omit' from source: magic vars 11701 1727096119.22765: variable 'omit' from source: magic vars 11701 1727096119.22796: variable 'omit' from source: magic vars 11701 1727096119.22830: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096119.22861: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096119.22881: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096119.22896: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096119.22908: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096119.22933: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096119.22936: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096119.22939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096119.23013: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096119.23017: Set connection var ansible_timeout to 10 11701 1727096119.23020: Set connection var ansible_shell_type to sh 11701 1727096119.23026: Set connection var ansible_shell_executable to /bin/sh 11701 1727096119.23028: Set connection var ansible_connection to ssh 11701 1727096119.23037: Set connection var ansible_pipelining to False 11701 1727096119.23056: variable 'ansible_shell_executable' from source: unknown 11701 1727096119.23059: variable 'ansible_connection' from source: unknown 11701 1727096119.23062: variable 'ansible_module_compression' from source: unknown 11701 1727096119.23064: variable 'ansible_shell_type' from source: unknown 11701 1727096119.23067: variable 'ansible_shell_executable' from source: unknown 11701 1727096119.23070: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096119.23075: variable 'ansible_pipelining' from source: unknown 11701 1727096119.23077: variable 'ansible_timeout' from source: unknown 11701 1727096119.23081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096119.23222: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096119.23225: variable 'omit' from source: magic vars 11701 1727096119.23229: starting attempt loop 11701 1727096119.23232: running the 
handler 11701 1727096119.23245: variable 'ansible_facts' from source: unknown 11701 1727096119.23264: _low_level_execute_command(): starting 11701 1727096119.23273: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096119.23804: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096119.23809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096119.23812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096119.23863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096119.23872: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096119.23874: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096119.23914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096119.25599: stdout chunk (state=3): >>>/root <<< 11701 1727096119.25702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096119.25733: stderr chunk (state=3): >>><<< 11701 1727096119.25736: stdout chunk (state=3): >>><<< 11701 1727096119.25760: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096119.25775: _low_level_execute_command(): starting 11701 1727096119.25780: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727096119.2576058-11885-56344382367050 `" && echo ansible-tmp-1727096119.2576058-11885-56344382367050="` echo /root/.ansible/tmp/ansible-tmp-1727096119.2576058-11885-56344382367050 `" ) && sleep 0' 11701 1727096119.26392: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096119.26416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096119.26459: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096119.26494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096119.28475: stdout chunk (state=3): >>>ansible-tmp-1727096119.2576058-11885-56344382367050=/root/.ansible/tmp/ansible-tmp-1727096119.2576058-11885-56344382367050 <<< 11701 1727096119.28647: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096119.28651: stdout chunk (state=3): >>><<< 11701 1727096119.28654: stderr chunk (state=3): >>><<< 11701 1727096119.28874: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096119.2576058-11885-56344382367050=/root/.ansible/tmp/ansible-tmp-1727096119.2576058-11885-56344382367050 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096119.28877: variable 'ansible_module_compression' from source: unknown 11701 1727096119.28880: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-11701ub3f79xu/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11701 1727096119.28882: variable 'ansible_facts' from source: unknown 11701 1727096119.29072: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096119.2576058-11885-56344382367050/AnsiballZ_setup.py 11701 1727096119.29355: Sending initial data 11701 1727096119.29359: Sent initial data (153 bytes) 11701 1727096119.30803: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096119.30896: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096119.30929: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096119.31090: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096119.32801: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096119.32908: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11701 1727096119.32947: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpk30ajqub /root/.ansible/tmp/ansible-tmp-1727096119.2576058-11885-56344382367050/AnsiballZ_setup.py <<< 11701 1727096119.32970: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096119.2576058-11885-56344382367050/AnsiballZ_setup.py" <<< 11701 1727096119.33110: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpk30ajqub" to remote "/root/.ansible/tmp/ansible-tmp-1727096119.2576058-11885-56344382367050/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096119.2576058-11885-56344382367050/AnsiballZ_setup.py" <<< 11701 1727096119.35633: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096119.35720: stderr chunk (state=3): >>><<< 11701 1727096119.35879: stdout chunk (state=3): >>><<< 11701 1727096119.35882: done transferring module to remote 11701 1727096119.35884: _low_level_execute_command(): starting 11701 1727096119.35886: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096119.2576058-11885-56344382367050/ /root/.ansible/tmp/ansible-tmp-1727096119.2576058-11885-56344382367050/AnsiballZ_setup.py && sleep 0' 11701 1727096119.37111: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096119.37135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096119.37302: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096119.37305: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096119.37359: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096119.37388: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096119.39329: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096119.39361: stderr chunk (state=3): >>><<< 11701 1727096119.39373: stdout chunk (state=3): >>><<< 11701 1727096119.39513: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096119.39516: _low_level_execute_command(): starting 11701 1727096119.39519: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096119.2576058-11885-56344382367050/AnsiballZ_setup.py && sleep 0' 11701 1727096119.40776: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096119.40830: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096119.40863: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096119.41087: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096120.05377: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.58447265625, "5m": 0.50732421875, "15m": 0.23681640625}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "55", "second": "19", "epoch": "1727096119", "epoch_int": "1727096119", "date": "2024-09-23", "time": "08:55:19", "iso8601_micro": "2024-09-23T12:55:19.689334Z", "iso8601": "2024-09-23T12:55:19Z", "iso8601_basic": "20240923T085519689334", "iso8601_basic_short": "20240923T085519", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": 
"guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7DqKfC0Ye8uNTSswfdzhsJfWNkZqr/RUstyC1z93+5saW22gxwNSvixV/q/ri8GxqpAsSE+oK2wsR0V0GdogZH271MQEZ63vu1adXuVvwMLN81oGLTzCz50BgmDXlvpEVTodRA7qlbviA7mmwUA/1WP1v1UqCiHmBjaiQT14v9PJhSSiWk1S/2gPlOmfl801arqI6YnvGYHiIop+XESUB4W8ewtJtTJhFqzucGnDCUYA1VAqcSIdjoQaZhZeDXc1gRTzx7QIe3EGJnCnomIPoXNvQk2UgnH08TJGZFoICgqdfZUi+g2uK+PweRJ1TUsmlf86iGOSFYgnCserDeAdpOIuLR5oBE8UNKjxPBW6hJ3It+vPT8mrYLOXrlbt7it9HhxfgYWcqc6ebJyBYNMo0bPdddEpIPiWDXZOoQmZGaGOpSa1K+8hxIvZ9t+jl8b8b8sODYeoVjVaeVXt0i5hoW0CoLWO77c+cGgwJ5+kzXD7eo/SVn+kYjDfKFBCtkJU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ80c6wNB873zUjbHI8VtU557DuKd4APS4WjMTqAOLKKumkxtoQY9gCkk5ZG2HLqdKHBgsFER7nThIQ+11R1mBw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINeHLdc2PHGxhVM3VxIFOOPlFPTEnJXEcPAkPavmyu6v", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-152", "ansible_nodename": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24182c1ede649ec25b32fe78ed72bb", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3<<< 11701 1727096120.05413: stdout chunk (state=3): >>>, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": 
[{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:dfff:fe7b:8e75", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": 
"off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.152"], "ansible_all_ipv6_addresses": ["fe80::8ff:dfff:fe7b:8e75"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.152", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:dfff:fe7b:8e75"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2986, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 545, "free": 2986}, "nocache": {"free": 3300, "used": 231}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_uuid": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", 
"partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_devic<<< 11701 1727096120.05552: stdout chunk (state=3): >>>e_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 262, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261806329856, "block_size": 4096, "block_total": 65519099, "block_available": 63917561, "block_used": 1601538, "inode_total": 131070960, "inode_available": 131029186, "inode_used": 41774, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 44844 10.31.14.152 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 44844 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_lsb": {}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11701 1727096120.07777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 11701 1727096120.07781: stdout chunk (state=3): >>><<< 11701 1727096120.07783: stderr chunk (state=3): >>><<< 11701 1727096120.07787: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.58447265625, "5m": 0.50732421875, "15m": 0.23681640625}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "55", "second": "19", "epoch": "1727096119", "epoch_int": "1727096119", "date": "2024-09-23", "time": "08:55:19", "iso8601_micro": "2024-09-23T12:55:19.689334Z", "iso8601": "2024-09-23T12:55:19Z", "iso8601_basic": "20240923T085519689334", "iso8601_basic_short": "20240923T085519", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7DqKfC0Ye8uNTSswfdzhsJfWNkZqr/RUstyC1z93+5saW22gxwNSvixV/q/ri8GxqpAsSE+oK2wsR0V0GdogZH271MQEZ63vu1adXuVvwMLN81oGLTzCz50BgmDXlvpEVTodRA7qlbviA7mmwUA/1WP1v1UqCiHmBjaiQT14v9PJhSSiWk1S/2gPlOmfl801arqI6YnvGYHiIop+XESUB4W8ewtJtTJhFqzucGnDCUYA1VAqcSIdjoQaZhZeDXc1gRTzx7QIe3EGJnCnomIPoXNvQk2UgnH08TJGZFoICgqdfZUi+g2uK+PweRJ1TUsmlf86iGOSFYgnCserDeAdpOIuLR5oBE8UNKjxPBW6hJ3It+vPT8mrYLOXrlbt7it9HhxfgYWcqc6ebJyBYNMo0bPdddEpIPiWDXZOoQmZGaGOpSa1K+8hxIvZ9t+jl8b8b8sODYeoVjVaeVXt0i5hoW0CoLWO77c+cGgwJ5+kzXD7eo/SVn+kYjDfKFBCtkJU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ80c6wNB873zUjbHI8VtU557DuKd4APS4WjMTqAOLKKumkxtoQY9gCkk5ZG2HLqdKHBgsFER7nThIQ+11R1mBw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINeHLdc2PHGxhVM3VxIFOOPlFPTEnJXEcPAkPavmyu6v", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-152", "ansible_nodename": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24182c1ede649ec25b32fe78ed72bb", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": 
"enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], 
"hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:dfff:fe7b:8e75", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.152"], "ansible_all_ipv6_addresses": ["fe80::8ff:dfff:fe7b:8e75"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.152", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:dfff:fe7b:8e75"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, 
"ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2986, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 545, "free": 2986}, "nocache": {"free": 3300, "used": 231}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_uuid": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 262, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261806329856, "block_size": 4096, "block_total": 65519099, "block_available": 63917561, "block_used": 1601538, "inode_total": 131070960, "inode_available": 131029186, "inode_used": 41774, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 44844 10.31.14.152 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 44844 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_lsb": {}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], 
"module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 11701 1727096120.08535: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096119.2576058-11885-56344382367050/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096120.08579: _low_level_execute_command(): starting 11701 1727096120.08594: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096119.2576058-11885-56344382367050/ > /dev/null 2>&1 && sleep 0' 11701 1727096120.10021: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096120.10064: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096120.10169: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting 
O_NONBLOCK <<< 11701 1727096120.10190: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096120.10263: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096120.12227: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096120.12232: stdout chunk (state=3): >>><<< 11701 1727096120.12235: stderr chunk (state=3): >>><<< 11701 1727096120.12407: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096120.12410: handler run complete 11701 1727096120.12528: variable 'ansible_facts' from source: unknown 11701 1727096120.12686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096120.13194: variable 'ansible_facts' from source: unknown 11701 1727096120.13241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096120.13375: attempt loop complete, returning result 11701 1727096120.13384: _execute() done 11701 1727096120.13390: dumping result to json 11701 1727096120.13437: done dumping result, returning 11701 1727096120.13496: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0afff68d-5257-a05c-c957-000000000128] 11701 1727096120.13590: sending task result for task 0afff68d-5257-a05c-c957-000000000128 11701 1727096120.15176: done sending task result for task 0afff68d-5257-a05c-c957-000000000128 11701 1727096120.15191: WORKER PROCESS EXITING ok: [managed_node3] 11701 1727096120.15696: no more pending results, returning what we have 11701 1727096120.15699: results queue empty 11701 1727096120.15700: checking for any_errors_fatal 11701 1727096120.15701: done checking for any_errors_fatal 11701 1727096120.15702: checking for max_fail_percentage 11701 1727096120.15703: done checking for max_fail_percentage 11701 1727096120.15704: checking to see if all hosts have failed and the running result is not ok 11701 1727096120.15705: done checking to see if all hosts have failed 11701 1727096120.15706: getting the remaining hosts for this loop 11701 1727096120.15707: done getting the remaining hosts for this loop 11701 1727096120.15711: getting the next task for host managed_node3 11701 1727096120.15716: done getting next task for host managed_node3 11701 1727096120.15718: ^ task 
is: TASK: meta (flush_handlers) 11701 1727096120.15719: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11701 1727096120.15723: getting variables 11701 1727096120.15724: in VariableManager get_vars() 11701 1727096120.15824: Calling all_inventory to load vars for managed_node3 11701 1727096120.15828: Calling groups_inventory to load vars for managed_node3 11701 1727096120.15905: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096120.15917: Calling all_plugins_play to load vars for managed_node3 11701 1727096120.15924: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096120.15927: Calling groups_plugins_play to load vars for managed_node3 11701 1727096120.16346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096120.16616: done with get_vars() 11701 1727096120.16629: done getting variables 11701 1727096120.16710: in VariableManager get_vars() 11701 1727096120.16725: Calling all_inventory to load vars for managed_node3 11701 1727096120.16727: Calling groups_inventory to load vars for managed_node3 11701 1727096120.16729: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096120.16734: Calling all_plugins_play to load vars for managed_node3 11701 1727096120.16736: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096120.16738: Calling groups_plugins_play to load vars for managed_node3 11701 1727096120.16897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096120.17164: done with get_vars() 11701 1727096120.17183: done queuing things up, now waiting for results queue to drain 11701 1727096120.17231: results queue empty 11701 1727096120.17244: checking for any_errors_fatal 11701 1727096120.17251: done checking for any_errors_fatal 11701 1727096120.17252: checking for max_fail_percentage 11701 1727096120.17254: done checking for max_fail_percentage 11701 1727096120.17254: checking to see if all hosts have failed and the running result is not ok 11701 1727096120.17255: done checking to see if all hosts have failed 11701 1727096120.17260: getting the remaining hosts for this loop 11701 1727096120.17261: done getting the remaining hosts for this loop 11701 1727096120.17264: getting the next task for host managed_node3 11701 1727096120.17270: done getting next task for host managed_node3 11701 1727096120.17273: ^ task is: TASK: INIT Prepare setup 11701 1727096120.17274: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096120.17276: getting variables 11701 1727096120.17277: in VariableManager get_vars() 11701 1727096120.17330: Calling all_inventory to load vars for managed_node3 11701 1727096120.17333: Calling groups_inventory to load vars for managed_node3 11701 1727096120.17335: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096120.17340: Calling all_plugins_play to load vars for managed_node3 11701 1727096120.17342: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096120.17345: Calling groups_plugins_play to load vars for managed_node3 11701 1727096120.17588: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096120.17848: done with get_vars() 11701 1727096120.17860: done getting variables 11701 1727096120.17939: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [INIT Prepare setup] ****************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:15 Monday 23 September 2024 08:55:20 -0400 (0:00:00.963) 0:00:04.144 ****** 11701 1727096120.17979: entering _queue_task() for managed_node3/debug 11701 1727096120.17999: Creating lock for debug 11701 1727096120.18552: worker is 1 (out of 1 available) 11701 1727096120.18565: exiting _queue_task() for managed_node3/debug 11701 1727096120.18577: done queuing things up, now waiting for results queue to drain 11701 1727096120.18579: waiting for pending results... 
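The debug action queued here for TASK: INIT Prepare setup (tests_bond.yml:15) later returns ok: [managed_node3] => {} with MSG: a row of '#' characters, and the trace immediately below evaluates the conditional ansible_distribution_major_version != '6' against the facts gathered above. A minimal sketch of the kind of task that would produce this trace, assuming the usual shape of these test playbooks; the exact task body is not reproduced in this log, and the when: condition may be inherited from an enclosing block rather than set on the task itself:

    - name: INIT Prepare setup
      debug:
        msg: "##################################################"
      when: ansible_distribution_major_version != '6'
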
11701 1727096120.18890: running TaskExecutor() for managed_node3/TASK: INIT Prepare setup 11701 1727096120.18971: in run() - task 0afff68d-5257-a05c-c957-00000000000b 11701 1727096120.19014: variable 'ansible_search_path' from source: unknown 11701 1727096120.19083: calling self._execute() 11701 1727096120.19299: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096120.19318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096120.19337: variable 'omit' from source: magic vars 11701 1727096120.20141: variable 'ansible_distribution_major_version' from source: facts 11701 1727096120.20272: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096120.20278: variable 'omit' from source: magic vars 11701 1727096120.20281: variable 'omit' from source: magic vars 11701 1727096120.20284: variable 'omit' from source: magic vars 11701 1727096120.20286: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096120.20406: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096120.20410: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096120.20412: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096120.20414: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096120.20433: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096120.20441: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096120.20448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096120.20629: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096120.20633: Set connection var ansible_timeout to 10 11701 1727096120.20636: Set connection var ansible_shell_type to sh 11701 1727096120.20665: Set connection var ansible_shell_executable to /bin/sh 11701 1727096120.20676: Set connection var ansible_connection to ssh 11701 1727096120.20690: Set connection var ansible_pipelining to False 11701 1727096120.20717: variable 'ansible_shell_executable' from source: unknown 11701 1727096120.20733: variable 'ansible_connection' from source: unknown 11701 1727096120.20740: variable 'ansible_module_compression' from source: unknown 11701 1727096120.20747: variable 'ansible_shell_type' from source: unknown 11701 1727096120.20756: variable 'ansible_shell_executable' from source: unknown 11701 1727096120.20763: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096120.20774: variable 'ansible_pipelining' from source: unknown 11701 1727096120.20960: variable 'ansible_timeout' from source: unknown 11701 1727096120.20964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096120.21240: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096120.21291: variable 'omit' from source: magic vars 11701 1727096120.21307: starting attempt loop 11701 1727096120.21315: running the handler 11701 
1727096120.21535: handler run complete 11701 1727096120.21618: attempt loop complete, returning result 11701 1727096120.21714: _execute() done 11701 1727096120.21717: dumping result to json 11701 1727096120.21719: done dumping result, returning 11701 1727096120.21722: done running TaskExecutor() for managed_node3/TASK: INIT Prepare setup [0afff68d-5257-a05c-c957-00000000000b] 11701 1727096120.21724: sending task result for task 0afff68d-5257-a05c-c957-00000000000b 11701 1727096120.21818: done sending task result for task 0afff68d-5257-a05c-c957-00000000000b 11701 1727096120.21822: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: ################################################## 11701 1727096120.21888: no more pending results, returning what we have 11701 1727096120.21892: results queue empty 11701 1727096120.21899: checking for any_errors_fatal 11701 1727096120.21902: done checking for any_errors_fatal 11701 1727096120.21902: checking for max_fail_percentage 11701 1727096120.21905: done checking for max_fail_percentage 11701 1727096120.21906: checking to see if all hosts have failed and the running result is not ok 11701 1727096120.21907: done checking to see if all hosts have failed 11701 1727096120.21908: getting the remaining hosts for this loop 11701 1727096120.21909: done getting the remaining hosts for this loop 11701 1727096120.21913: getting the next task for host managed_node3 11701 1727096120.21920: done getting next task for host managed_node3 11701 1727096120.21923: ^ task is: TASK: Install dnsmasq 11701 1727096120.21926: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096120.21930: getting variables 11701 1727096120.21934: in VariableManager get_vars() 11701 1727096120.21983: Calling all_inventory to load vars for managed_node3 11701 1727096120.21986: Calling groups_inventory to load vars for managed_node3 11701 1727096120.21989: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096120.22001: Calling all_plugins_play to load vars for managed_node3 11701 1727096120.22003: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096120.22006: Calling groups_plugins_play to load vars for managed_node3 11701 1727096120.22717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096120.22996: done with get_vars() 11701 1727096120.23009: done getting variables 11701 1727096120.23178: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Monday 23 September 2024 08:55:20 -0400 (0:00:00.052) 0:00:04.196 ****** 11701 1727096120.23210: entering _queue_task() for managed_node3/package 11701 1727096120.23574: worker is 1 (out of 1 available) 11701 1727096120.23587: exiting _queue_task() for managed_node3/package 11701 1727096120.23713: done queuing things up, now waiting for results queue to drain 11701 1727096120.23715: waiting for pending results... 
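The Install dnsmasq task at create_test_interfaces_with_dhcp.yml:3 is dispatched through the generic package action plugin; since the gathered facts above report ansible_pkg_mgr: dnf, the action would delegate to the dnf module on this CentOS Stream 10 host. A minimal sketch of such a task in its conventional form; the exact module arguments are not visible in this excerpt:

    - name: Install dnsmasq
      package:
        name: dnsmasq
        state: present

The same ansible_distribution_major_version != '6' conditional is evaluated for this task in the trace that follows, before the module is transferred and executed.
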
11701 1727096120.24180: running TaskExecutor() for managed_node3/TASK: Install dnsmasq 11701 1727096120.24185: in run() - task 0afff68d-5257-a05c-c957-00000000000f 11701 1727096120.24196: variable 'ansible_search_path' from source: unknown 11701 1727096120.24203: variable 'ansible_search_path' from source: unknown 11701 1727096120.24205: calling self._execute() 11701 1727096120.24207: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096120.24210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096120.24301: variable 'omit' from source: magic vars 11701 1727096120.24933: variable 'ansible_distribution_major_version' from source: facts 11701 1727096120.24957: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096120.24978: variable 'omit' from source: magic vars 11701 1727096120.25072: variable 'omit' from source: magic vars 11701 1727096120.25340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11701 1727096120.28710: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11701 1727096120.28719: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11701 1727096120.28774: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11701 1727096120.28917: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11701 1727096120.29042: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11701 1727096120.29239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096120.29294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096120.29364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096120.29536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096120.29540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096120.29779: variable '__network_is_ostree' from source: set_fact 11701 1727096120.29783: variable 'omit' from source: magic vars 11701 1727096120.29785: variable 'omit' from source: magic vars 11701 1727096120.29835: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096120.29943: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096120.30020: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096120.30043: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 11701 1727096120.30059: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096120.30103: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096120.30137: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096120.30150: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096120.30346: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096120.30349: Set connection var ansible_timeout to 10 11701 1727096120.30352: Set connection var ansible_shell_type to sh 11701 1727096120.30389: Set connection var ansible_shell_executable to /bin/sh 11701 1727096120.30581: Set connection var ansible_connection to ssh 11701 1727096120.30588: Set connection var ansible_pipelining to False 11701 1727096120.30591: variable 'ansible_shell_executable' from source: unknown 11701 1727096120.30593: variable 'ansible_connection' from source: unknown 11701 1727096120.30595: variable 'ansible_module_compression' from source: unknown 11701 1727096120.30597: variable 'ansible_shell_type' from source: unknown 11701 1727096120.30599: variable 'ansible_shell_executable' from source: unknown 11701 1727096120.30605: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096120.30608: variable 'ansible_pipelining' from source: unknown 11701 1727096120.30613: variable 'ansible_timeout' from source: unknown 11701 1727096120.30615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096120.30956: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096120.30960: variable 'omit' from source: magic vars 11701 1727096120.30962: starting attempt loop 11701 1727096120.30964: running the handler 11701 1727096120.30975: variable 'ansible_facts' from source: unknown 11701 1727096120.30978: variable 'ansible_facts' from source: unknown 11701 1727096120.31127: _low_level_execute_command(): starting 11701 1727096120.31130: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096120.32419: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096120.32508: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096120.32583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096120.34307: stdout chunk (state=3): >>>/root <<< 11701 1727096120.34474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096120.34478: stdout chunk (state=3): >>><<< 11701 1727096120.34480: stderr chunk (state=3): >>><<< 11701 1727096120.34501: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096120.34607: _low_level_execute_command(): starting 11701 1727096120.34611: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096120.3451424-11936-70950107551554 `" && echo ansible-tmp-1727096120.3451424-11936-70950107551554="` echo /root/.ansible/tmp/ansible-tmp-1727096120.3451424-11936-70950107551554 `" ) && sleep 0' 11701 1727096120.35180: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096120.35203: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096120.35221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096120.35238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096120.35255: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096120.35266: stderr chunk (state=3): >>>debug2: match not found <<< 11701 1727096120.35282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096120.35322: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096120.35400: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096120.35428: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096120.35452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096120.35538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096120.37822: stdout chunk (state=3): >>>ansible-tmp-1727096120.3451424-11936-70950107551554=/root/.ansible/tmp/ansible-tmp-1727096120.3451424-11936-70950107551554 <<< 11701 1727096120.37827: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096120.37830: stdout chunk (state=3): >>><<< 11701 1727096120.37832: stderr chunk (state=3): >>><<< 11701 1727096120.37978: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096120.3451424-11936-70950107551554=/root/.ansible/tmp/ansible-tmp-1727096120.3451424-11936-70950107551554 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096120.37982: variable 'ansible_module_compression' from source: unknown 11701 1727096120.37985: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 11701 1727096120.37988: ANSIBALLZ: Acquiring lock 11701 1727096120.37990: ANSIBALLZ: Lock acquired: 139907404354416 11701 1727096120.37992: ANSIBALLZ: Creating module 11701 1727096120.55172: ANSIBALLZ: Writing module into payload 11701 1727096120.55371: ANSIBALLZ: Writing module 11701 1727096120.55403: ANSIBALLZ: Renaming module 11701 1727096120.55408: ANSIBALLZ: Done creating module 11701 1727096120.55420: variable 'ansible_facts' from source: unknown 11701 1727096120.55517: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096120.3451424-11936-70950107551554/AnsiballZ_dnf.py 11701 1727096120.55844: Sending initial data 11701 1727096120.55848: Sent initial data (151 bytes) 11701 1727096120.56370: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096120.56376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096120.56479: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096120.56501: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096120.56549: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096120.56580: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096120.58296: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096120.58334: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11701 1727096120.58402: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpivl_3_fk /root/.ansible/tmp/ansible-tmp-1727096120.3451424-11936-70950107551554/AnsiballZ_dnf.py <<< 11701 1727096120.58406: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096120.3451424-11936-70950107551554/AnsiballZ_dnf.py" <<< 11701 1727096120.58438: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpivl_3_fk" to remote "/root/.ansible/tmp/ansible-tmp-1727096120.3451424-11936-70950107551554/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096120.3451424-11936-70950107551554/AnsiballZ_dnf.py" <<< 11701 1727096120.59422: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096120.59456: stderr chunk (state=3): >>><<< 11701 1727096120.59550: stdout chunk (state=3): >>><<< 11701 1727096120.59553: done transferring module to remote 11701 1727096120.59555: _low_level_execute_command(): starting 11701 1727096120.59558: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096120.3451424-11936-70950107551554/ /root/.ansible/tmp/ansible-tmp-1727096120.3451424-11936-70950107551554/AnsiballZ_dnf.py && sleep 0' 11701 1727096120.60284: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096120.60288: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096120.60360: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096120.60386: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096120.60401: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096120.60486: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096120.62426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096120.62431: stdout chunk (state=3): >>><<< 11701 1727096120.62433: stderr chunk (state=3): >>><<< 11701 1727096120.62539: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096120.62554: _low_level_execute_command(): starting 11701 1727096120.62557: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096120.3451424-11936-70950107551554/AnsiballZ_dnf.py && sleep 0' 11701 1727096120.63147: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096120.63200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096120.63251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096121.06618: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 11701 1727096121.26544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 11701 1727096121.26548: stdout chunk (state=3): >>><<< 11701 1727096121.26550: stderr chunk (state=3): >>><<< 11701 1727096121.26553: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
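Annotation: the module result above ("msg": "Nothing to do", "changed": false, rc=0) means dnf found dnsmasq already installed, so the task is a no-op on this host. Judging from the logged module_args (name: ["dnsmasq"], state: "present") and the 'package' action plugin being loaded, the task was most likely written against the generic package module, roughly as in the sketch below; the actual task file is not part of this excerpt, so the exact wording and module name are assumptions.

    # Hedged reconstruction of the "Install dnsmasq" task implied by the
    # module_args above; module choice and layout are assumptions.
    - name: Install dnsmasq
      ansible.builtin.package:
        name: dnsmasq
        state: present

On a RedHat-family target the generic package action resolves to the dnf module, which matches the AnsiballZ_dnf.py payload transferred and executed in the log above.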
11701 1727096121.26760: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096120.3451424-11936-70950107551554/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096121.26764: _low_level_execute_command(): starting 11701 1727096121.26767: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096120.3451424-11936-70950107551554/ > /dev/null 2>&1 && sleep 0' 11701 1727096121.27447: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096121.27463: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096121.27483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096121.27501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096121.27530: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096121.27593: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096121.27679: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096121.27971: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096121.28024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096121.30332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096121.30336: stdout chunk (state=3): >>><<< 11701 1727096121.30339: stderr chunk (state=3): >>><<< 11701 1727096121.30341: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096121.30343: handler run complete 11701 1727096121.30525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11701 1727096121.30946: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11701 1727096121.31093: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11701 1727096121.31131: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11701 1727096121.31171: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11701 1727096121.31374: variable '__install_status' from source: unknown 11701 1727096121.31400: Evaluated conditional (__install_status is success): True 11701 1727096121.31484: attempt loop complete, returning result 11701 1727096121.31493: _execute() done 11701 1727096121.31500: dumping result to json 11701 1727096121.31510: done dumping result, returning 11701 1727096121.31530: done running TaskExecutor() for managed_node3/TASK: Install dnsmasq [0afff68d-5257-a05c-c957-00000000000f] 11701 1727096121.31539: sending task result for task 0afff68d-5257-a05c-c957-00000000000f ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 11701 1727096121.31835: no more pending results, returning what we have 11701 1727096121.31839: results queue empty 11701 1727096121.31841: checking for any_errors_fatal 11701 1727096121.31852: done checking for any_errors_fatal 11701 1727096121.31853: checking for max_fail_percentage 11701 1727096121.31855: done checking for max_fail_percentage 11701 1727096121.31856: checking to see if all hosts have failed and the running result is not ok 11701 1727096121.31857: done checking to see if all hosts have failed 11701 1727096121.31858: getting the remaining hosts for this loop 11701 1727096121.31859: done getting the remaining hosts for this loop 11701 1727096121.31863: getting the next task for host managed_node3 11701 1727096121.31872: done getting next task for host managed_node3 11701 1727096121.31875: ^ task is: TASK: Install pgrep, sysctl 11701 1727096121.31878: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096121.31882: getting variables 11701 1727096121.31884: in VariableManager get_vars() 11701 1727096121.31927: Calling all_inventory to load vars for managed_node3 11701 1727096121.31930: Calling groups_inventory to load vars for managed_node3 11701 1727096121.31933: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096121.31945: Calling all_plugins_play to load vars for managed_node3 11701 1727096121.31948: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096121.31951: Calling groups_plugins_play to load vars for managed_node3 11701 1727096121.32853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096121.33396: done with get_vars() 11701 1727096121.33408: done getting variables 11701 1727096121.33438: done sending task result for task 0afff68d-5257-a05c-c957-00000000000f 11701 1727096121.33441: WORKER PROCESS EXITING 11701 1727096121.33597: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Monday 23 September 2024 08:55:21 -0400 (0:00:01.104) 0:00:05.300 ****** 11701 1727096121.33634: entering _queue_task() for managed_node3/package 11701 1727096121.34041: worker is 1 (out of 1 available) 11701 1727096121.34056: exiting _queue_task() for managed_node3/package 11701 1727096121.34070: done queuing things up, now waiting for results queue to drain 11701 1727096121.34071: waiting for pending results... 
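Annotation: two details in the dnsmasq result above suggest a retry wrapper around the install: the summary reports "attempts": 1, and the log evaluates the conditional (__install_status is success). That is the usual register/until pattern; a hedged sketch follows, with the retry count and delay as placeholders since neither value is visible in this log.

    # Assumed retry wrapper suggested by "attempts": 1 and the
    # "__install_status is success" conditional; retries/delay are placeholders.
    - name: Install dnsmasq
      ansible.builtin.package:
        name: dnsmasq
        state: present
      register: __install_status
      until: __install_status is success
      retries: 3
      delay: 10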
11701 1727096121.34332: running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl 11701 1727096121.34453: in run() - task 0afff68d-5257-a05c-c957-000000000010 11701 1727096121.34475: variable 'ansible_search_path' from source: unknown 11701 1727096121.34482: variable 'ansible_search_path' from source: unknown 11701 1727096121.34527: calling self._execute() 11701 1727096121.34616: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096121.34628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096121.34642: variable 'omit' from source: magic vars 11701 1727096121.35017: variable 'ansible_distribution_major_version' from source: facts 11701 1727096121.35042: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096121.35173: variable 'ansible_os_family' from source: facts 11701 1727096121.35184: Evaluated conditional (ansible_os_family == 'RedHat'): True 11701 1727096121.35371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11701 1727096121.35659: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11701 1727096121.35717: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11701 1727096121.35799: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11701 1727096121.35841: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11701 1727096121.36018: variable 'ansible_distribution_major_version' from source: facts 11701 1727096121.36022: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 11701 1727096121.36024: when evaluation is False, skipping this task 11701 1727096121.36026: _execute() done 11701 1727096121.36028: dumping result to json 11701 1727096121.36030: done dumping result, returning 11701 1727096121.36031: done running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl [0afff68d-5257-a05c-c957-000000000010] 11701 1727096121.36033: sending task result for task 0afff68d-5257-a05c-c957-000000000010 11701 1727096121.36109: done sending task result for task 0afff68d-5257-a05c-c957-000000000010 11701 1727096121.36113: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 11701 1727096121.36172: no more pending results, returning what we have 11701 1727096121.36177: results queue empty 11701 1727096121.36178: checking for any_errors_fatal 11701 1727096121.36186: done checking for any_errors_fatal 11701 1727096121.36187: checking for max_fail_percentage 11701 1727096121.36189: done checking for max_fail_percentage 11701 1727096121.36190: checking to see if all hosts have failed and the running result is not ok 11701 1727096121.36191: done checking to see if all hosts have failed 11701 1727096121.36192: getting the remaining hosts for this loop 11701 1727096121.36194: done getting the remaining hosts for this loop 11701 1727096121.36198: getting the next task for host managed_node3 11701 1727096121.36205: done getting next task for host managed_node3 11701 1727096121.36209: ^ task is: TASK: Install pgrep, sysctl 11701 1727096121.36212: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11701 1727096121.36215: getting variables 11701 1727096121.36218: in VariableManager get_vars() 11701 1727096121.36381: Calling all_inventory to load vars for managed_node3 11701 1727096121.36385: Calling groups_inventory to load vars for managed_node3 11701 1727096121.36388: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096121.36402: Calling all_plugins_play to load vars for managed_node3 11701 1727096121.36405: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096121.36408: Calling groups_plugins_play to load vars for managed_node3 11701 1727096121.36774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096121.37105: done with get_vars() 11701 1727096121.37118: done getting variables 11701 1727096121.37295: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Monday 23 September 2024 08:55:21 -0400 (0:00:00.036) 0:00:05.337 ****** 11701 1727096121.37327: entering _queue_task() for managed_node3/package 11701 1727096121.37900: worker is 1 (out of 1 available) 11701 1727096121.38073: exiting _queue_task() for managed_node3/package 11701 1727096121.38084: done queuing things up, now waiting for results queue to drain 11701 1727096121.38086: waiting for pending results... 
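Annotation: the skip recorded above follows directly from the task's when: clause. The result names false_condition "ansible_distribution_major_version is version('6', '<=')", so on this host (major version newer than 6) the older-release variant of 'Install pgrep, sysctl' at create_test_interfaces_with_dhcp.yml:17 never executes. A hedged sketch of that gating pattern; the package name is a placeholder because the task body is not echoed for skipped tasks.

    # Sketch of the older-release variant; only the when: test is taken from the log,
    # the package name is a placeholder and not shown in this excerpt.
    - name: Install pgrep, sysctl
      ansible.builtin.package:
        name: procps        # placeholder
        state: present
      when: ansible_distribution_major_version is version('6', '<=')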
11701 1727096121.38745: running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl 11701 1727096121.38753: in run() - task 0afff68d-5257-a05c-c957-000000000011 11701 1727096121.38757: variable 'ansible_search_path' from source: unknown 11701 1727096121.38760: variable 'ansible_search_path' from source: unknown 11701 1727096121.38762: calling self._execute() 11701 1727096121.38765: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096121.38769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096121.38772: variable 'omit' from source: magic vars 11701 1727096121.39162: variable 'ansible_distribution_major_version' from source: facts 11701 1727096121.39182: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096121.39307: variable 'ansible_os_family' from source: facts 11701 1727096121.39317: Evaluated conditional (ansible_os_family == 'RedHat'): True 11701 1727096121.39565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11701 1727096121.39780: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11701 1727096121.39833: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11701 1727096121.39872: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11701 1727096121.39920: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11701 1727096121.40019: variable 'ansible_distribution_major_version' from source: facts 11701 1727096121.40038: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 11701 1727096121.40049: variable 'omit' from source: magic vars 11701 1727096121.40112: variable 'omit' from source: magic vars 11701 1727096121.40272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11701 1727096121.42387: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11701 1727096121.42712: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11701 1727096121.42716: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11701 1727096121.42875: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11701 1727096121.42921: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11701 1727096121.43026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096121.43059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096121.43092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096121.43142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096121.43163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096121.43269: variable '__network_is_ostree' from source: set_fact 11701 1727096121.43280: variable 'omit' from source: magic vars 11701 1727096121.43310: variable 'omit' from source: magic vars 11701 1727096121.43342: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096121.43373: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096121.43396: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096121.43416: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096121.43436: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096121.43474: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096121.43482: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096121.43488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096121.43588: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096121.43598: Set connection var ansible_timeout to 10 11701 1727096121.43605: Set connection var ansible_shell_type to sh 11701 1727096121.43613: Set connection var ansible_shell_executable to /bin/sh 11701 1727096121.43619: Set connection var ansible_connection to ssh 11701 1727096121.43631: Set connection var ansible_pipelining to False 11701 1727096121.43665: variable 'ansible_shell_executable' from source: unknown 11701 1727096121.43675: variable 'ansible_connection' from source: unknown 11701 1727096121.43682: variable 'ansible_module_compression' from source: unknown 11701 1727096121.43875: variable 'ansible_shell_type' from source: unknown 11701 1727096121.43878: variable 'ansible_shell_executable' from source: unknown 11701 1727096121.43881: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096121.43883: variable 'ansible_pipelining' from source: unknown 11701 1727096121.43885: variable 'ansible_timeout' from source: unknown 11701 1727096121.43887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096121.44003: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096121.44018: variable 'omit' from source: magic vars 11701 1727096121.44029: starting attempt loop 11701 1727096121.44036: running the handler 11701 1727096121.44048: variable 'ansible_facts' from source: unknown 11701 1727096121.44202: variable 'ansible_facts' from source: unknown 11701 1727096121.44205: _low_level_execute_command(): starting 11701 1727096121.44207: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096121.44981: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096121.45023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096121.45050: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096121.45077: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096121.45152: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096121.47009: stdout chunk (state=3): >>>/root <<< 11701 1727096121.47161: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096121.47166: stdout chunk (state=3): >>><<< 11701 1727096121.47171: stderr chunk (state=3): >>><<< 11701 1727096121.47196: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096121.47379: _low_level_execute_command(): starting 11701 1727096121.47382: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096121.4727623-11994-60369413102516 `" && echo ansible-tmp-1727096121.4727623-11994-60369413102516="` echo /root/.ansible/tmp/ansible-tmp-1727096121.4727623-11994-60369413102516 `" ) && sleep 0' 11701 1727096121.48492: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096121.48496: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 11701 1727096121.48499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096121.48502: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096121.48504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096121.48612: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096121.48618: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096121.48621: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096121.48884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096121.50674: stdout chunk (state=3): >>>ansible-tmp-1727096121.4727623-11994-60369413102516=/root/.ansible/tmp/ansible-tmp-1727096121.4727623-11994-60369413102516 <<< 11701 1727096121.50824: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096121.50834: stdout chunk (state=3): >>><<< 11701 1727096121.50853: stderr chunk (state=3): >>><<< 11701 1727096121.50879: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096121.4727623-11994-60369413102516=/root/.ansible/tmp/ansible-tmp-1727096121.4727623-11994-60369413102516 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096121.50918: variable 'ansible_module_compression' from source: unknown 11701 1727096121.51079: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11701ub3f79xu/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 11701 1727096121.51082: variable 'ansible_facts' from source: unknown 11701 1727096121.51173: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727096121.4727623-11994-60369413102516/AnsiballZ_dnf.py 11701 1727096121.51426: Sending initial data 11701 1727096121.51430: Sent initial data (151 bytes) 11701 1727096121.52038: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096121.52066: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096121.52174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096121.52205: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096121.52277: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096121.53933: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11701 1727096121.53962: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096121.54017: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11701 1727096121.54072: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmp1utdlo5d /root/.ansible/tmp/ansible-tmp-1727096121.4727623-11994-60369413102516/AnsiballZ_dnf.py <<< 11701 1727096121.54076: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096121.4727623-11994-60369413102516/AnsiballZ_dnf.py" <<< 11701 1727096121.54106: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmp1utdlo5d" to remote "/root/.ansible/tmp/ansible-tmp-1727096121.4727623-11994-60369413102516/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096121.4727623-11994-60369413102516/AnsiballZ_dnf.py" <<< 11701 1727096121.55107: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096121.55123: stderr chunk (state=3): >>><<< 11701 1727096121.55126: stdout chunk (state=3): >>><<< 11701 1727096121.55576: done transferring module to remote 11701 1727096121.55579: _low_level_execute_command(): starting 11701 1727096121.55582: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096121.4727623-11994-60369413102516/ /root/.ansible/tmp/ansible-tmp-1727096121.4727623-11994-60369413102516/AnsiballZ_dnf.py && sleep 0' 11701 1727096121.55951: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096121.55964: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096121.55978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096121.55992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096121.56005: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096121.56011: stderr chunk (state=3): >>>debug2: match not found <<< 11701 1727096121.56028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096121.56044: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11701 1727096121.56051: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 11701 1727096121.56061: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11701 1727096121.56071: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096121.56082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096121.56141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096121.56176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096121.56190: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096121.56208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096121.56279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096121.58376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 11701 1727096121.58380: stdout chunk (state=3): >>><<< 11701 1727096121.58383: stderr chunk (state=3): >>><<< 11701 1727096121.58386: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096121.58388: _low_level_execute_command(): starting 11701 1727096121.58391: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096121.4727623-11994-60369413102516/AnsiballZ_dnf.py && sleep 0' 11701 1727096121.58988: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096121.58996: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096121.59008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096121.59023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096121.59042: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096121.59055: stderr chunk (state=3): >>>debug2: match not found <<< 11701 1727096121.59062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096121.59081: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11701 1727096121.59089: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 11701 1727096121.59096: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11701 1727096121.59104: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096121.59114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096121.59126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096121.59133: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096121.59145: stderr chunk (state=3): >>>debug2: match found <<< 11701 1727096121.59164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096121.59226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 
1727096121.59239: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096121.59273: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096121.59355: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096122.03304: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 11701 1727096122.08174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 11701 1727096122.08205: stderr chunk (state=3): >>><<< 11701 1727096122.08208: stdout chunk (state=3): >>><<< 11701 1727096122.08229: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
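Annotation: the second 'Install pgrep, sysctl' variant (create_test_interfaces_with_dhcp.yml:26) did run: its conditionals all passed, including ansible_distribution_major_version is version('7', '>='), and the dnf module reported procps-ng already present ("Nothing to do", changed: false). A hedged reconstruction from the logged module_args and conditionals; module choice and layout are assumptions.

    # Sketch of the newer-release variant implied by module_args name: ["procps-ng"]
    # and the version('7', '>=') conditional evaluated as True in the log.
    - name: Install pgrep, sysctl
      ansible.builtin.package:
        name: procps-ng
        state: present
      when: ansible_distribution_major_version is version('7', '>=')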
11701 1727096122.08261: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096121.4727623-11994-60369413102516/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096122.08269: _low_level_execute_command(): starting 11701 1727096122.08275: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096121.4727623-11994-60369413102516/ > /dev/null 2>&1 && sleep 0' 11701 1727096122.08748: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096122.08752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096122.08756: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096122.08758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096122.08814: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096122.08817: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096122.08819: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096122.08864: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096122.10757: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096122.10785: stderr chunk (state=3): >>><<< 11701 1727096122.10789: stdout chunk (state=3): >>><<< 11701 1727096122.10807: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096122.10814: handler run complete 11701 1727096122.10838: attempt loop complete, returning result 11701 1727096122.10841: _execute() done 11701 1727096122.10844: dumping result to json 11701 1727096122.10849: done dumping result, returning 11701 1727096122.10860: done running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl [0afff68d-5257-a05c-c957-000000000011] 11701 1727096122.10862: sending task result for task 0afff68d-5257-a05c-c957-000000000011 11701 1727096122.10959: done sending task result for task 0afff68d-5257-a05c-c957-000000000011 11701 1727096122.10962: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 11701 1727096122.11031: no more pending results, returning what we have 11701 1727096122.11035: results queue empty 11701 1727096122.11036: checking for any_errors_fatal 11701 1727096122.11042: done checking for any_errors_fatal 11701 1727096122.11042: checking for max_fail_percentage 11701 1727096122.11044: done checking for max_fail_percentage 11701 1727096122.11045: checking to see if all hosts have failed and the running result is not ok 11701 1727096122.11046: done checking to see if all hosts have failed 11701 1727096122.11047: getting the remaining hosts for this loop 11701 1727096122.11048: done getting the remaining hosts for this loop 11701 1727096122.11051: getting the next task for host managed_node3 11701 1727096122.11057: done getting next task for host managed_node3 11701 1727096122.11059: ^ task is: TASK: Create test interfaces 11701 1727096122.11063: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096122.11073: getting variables 11701 1727096122.11075: in VariableManager get_vars() 11701 1727096122.11117: Calling all_inventory to load vars for managed_node3 11701 1727096122.11120: Calling groups_inventory to load vars for managed_node3 11701 1727096122.11122: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096122.11133: Calling all_plugins_play to load vars for managed_node3 11701 1727096122.11135: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096122.11138: Calling groups_plugins_play to load vars for managed_node3 11701 1727096122.11438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096122.11666: done with get_vars() 11701 1727096122.11680: done getting variables 11701 1727096122.11779: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Monday 23 September 2024 08:55:22 -0400 (0:00:00.744) 0:00:06.082 ****** 11701 1727096122.11810: entering _queue_task() for managed_node3/shell 11701 1727096122.11811: Creating lock for shell 11701 1727096122.12294: worker is 1 (out of 1 available) 11701 1727096122.12305: exiting _queue_task() for managed_node3/shell 11701 1727096122.12314: done queuing things up, now waiting for results queue to drain 11701 1727096122.12316: waiting for pending results... 
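The "Install pgrep, sysctl" task above ran ansible.legacy.dnf with name=['procps-ng'] and state=present and came back unchanged ("Nothing to do") because the package was already installed on the managed node. For reference, a rough command-line equivalent of that module call is sketched below; it is illustrative only and is not what AnsiballZ_dnf.py actually executes.

#!/bin/sh
# Illustrative sketch of the dnf module call logged above
# (name=procps-ng, state=present); not the module's real implementation.
if rpm -q procps-ng > /dev/null 2>&1; then
    echo "Nothing to do"         # package already present -> changed=false
else
    dnf -y install procps-ng     # what state=present would otherwise trigger
fi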
11701 1727096122.12429: running TaskExecutor() for managed_node3/TASK: Create test interfaces 11701 1727096122.12507: in run() - task 0afff68d-5257-a05c-c957-000000000012 11701 1727096122.12519: variable 'ansible_search_path' from source: unknown 11701 1727096122.12523: variable 'ansible_search_path' from source: unknown 11701 1727096122.12553: calling self._execute() 11701 1727096122.12620: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096122.12623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096122.12633: variable 'omit' from source: magic vars 11701 1727096122.12906: variable 'ansible_distribution_major_version' from source: facts 11701 1727096122.12917: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096122.12929: variable 'omit' from source: magic vars 11701 1727096122.12962: variable 'omit' from source: magic vars 11701 1727096122.13212: variable 'dhcp_interface1' from source: play vars 11701 1727096122.13218: variable 'dhcp_interface2' from source: play vars 11701 1727096122.13244: variable 'omit' from source: magic vars 11701 1727096122.13284: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096122.13312: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096122.13329: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096122.13342: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096122.13355: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096122.13382: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096122.13385: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096122.13388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096122.13460: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096122.13463: Set connection var ansible_timeout to 10 11701 1727096122.13469: Set connection var ansible_shell_type to sh 11701 1727096122.13480: Set connection var ansible_shell_executable to /bin/sh 11701 1727096122.13483: Set connection var ansible_connection to ssh 11701 1727096122.13485: Set connection var ansible_pipelining to False 11701 1727096122.13504: variable 'ansible_shell_executable' from source: unknown 11701 1727096122.13507: variable 'ansible_connection' from source: unknown 11701 1727096122.13509: variable 'ansible_module_compression' from source: unknown 11701 1727096122.13511: variable 'ansible_shell_type' from source: unknown 11701 1727096122.13513: variable 'ansible_shell_executable' from source: unknown 11701 1727096122.13515: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096122.13520: variable 'ansible_pipelining' from source: unknown 11701 1727096122.13522: variable 'ansible_timeout' from source: unknown 11701 1727096122.13526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096122.13632: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096122.13641: variable 'omit' from source: magic vars 11701 1727096122.13647: starting attempt loop 11701 1727096122.13649: running the handler 11701 1727096122.13661: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096122.13677: _low_level_execute_command(): starting 11701 1727096122.13683: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096122.14374: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096122.14690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096122.14706: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096122.14725: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096122.14795: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096122.16510: stdout chunk (state=3): >>>/root <<< 11701 1727096122.16730: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096122.16764: stderr chunk (state=3): >>><<< 11701 1727096122.16770: stdout chunk (state=3): >>><<< 11701 1727096122.16796: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096122.16810: _low_level_execute_command(): starting 11701 1727096122.16817: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096122.167961-12029-29829723691393 `" && echo ansible-tmp-1727096122.167961-12029-29829723691393="` echo /root/.ansible/tmp/ansible-tmp-1727096122.167961-12029-29829723691393 `" ) && sleep 0' 11701 1727096122.17985: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096122.17989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 11701 1727096122.18001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11701 1727096122.18004: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096122.18006: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096122.18055: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096122.18060: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096122.18069: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096122.18106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096122.20576: stdout chunk (state=3): >>>ansible-tmp-1727096122.167961-12029-29829723691393=/root/.ansible/tmp/ansible-tmp-1727096122.167961-12029-29829723691393 <<< 11701 1727096122.20580: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096122.20582: stdout chunk (state=3): >>><<< 11701 1727096122.20585: stderr chunk (state=3): >>><<< 11701 1727096122.20587: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096122.167961-12029-29829723691393=/root/.ansible/tmp/ansible-tmp-1727096122.167961-12029-29829723691393 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096122.20589: variable 'ansible_module_compression' from source: unknown 11701 1727096122.20591: ANSIBALLZ: Using generic lock for ansible.legacy.command 11701 1727096122.20593: ANSIBALLZ: Acquiring lock 11701 1727096122.20595: ANSIBALLZ: Lock acquired: 139907404354416 11701 1727096122.20596: ANSIBALLZ: Creating module 11701 1727096122.37056: ANSIBALLZ: Writing module into payload 11701 1727096122.37150: ANSIBALLZ: Writing module 11701 1727096122.37180: ANSIBALLZ: Renaming module 11701 1727096122.37198: ANSIBALLZ: Done creating module 11701 1727096122.37219: variable 'ansible_facts' from source: unknown 11701 1727096122.37299: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096122.167961-12029-29829723691393/AnsiballZ_command.py 11701 1727096122.37534: Sending initial data 11701 1727096122.37544: Sent initial data (154 bytes) 11701 1727096122.38194: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096122.38206: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096122.38223: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096122.38295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096122.39963: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096122.39994: stderr chunk (state=3): 
>>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11701 1727096122.40034: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmp_996e0mb /root/.ansible/tmp/ansible-tmp-1727096122.167961-12029-29829723691393/AnsiballZ_command.py <<< 11701 1727096122.40038: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096122.167961-12029-29829723691393/AnsiballZ_command.py" <<< 11701 1727096122.40072: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmp_996e0mb" to remote "/root/.ansible/tmp/ansible-tmp-1727096122.167961-12029-29829723691393/AnsiballZ_command.py" <<< 11701 1727096122.40090: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096122.167961-12029-29829723691393/AnsiballZ_command.py" <<< 11701 1727096122.40613: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096122.40659: stderr chunk (state=3): >>><<< 11701 1727096122.40664: stdout chunk (state=3): >>><<< 11701 1727096122.40708: done transferring module to remote 11701 1727096122.40716: _low_level_execute_command(): starting 11701 1727096122.40721: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096122.167961-12029-29829723691393/ /root/.ansible/tmp/ansible-tmp-1727096122.167961-12029-29829723691393/AnsiballZ_command.py && sleep 0' 11701 1727096122.41416: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096122.41446: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096122.41466: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096122.41487: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096122.41648: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096122.43563: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096122.43578: stdout chunk (state=3): >>><<< 11701 1727096122.43592: stderr chunk (state=3): >>><<< 11701 1727096122.43666: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096122.43672: _low_level_execute_command(): starting 11701 1727096122.43674: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096122.167961-12029-29829723691393/AnsiballZ_command.py && sleep 0' 11701 1727096122.44230: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096122.44243: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096122.44282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096122.44302: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 11701 1727096122.44318: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096122.44390: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096122.44406: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096122.44455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096123.83208: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 705 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 705 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' 
/etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/<<< 11701 1727096123.83218: stdout chunk (state=3): >>>show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-23 08:55:22.602092", "end": "2024-09-23 08:55:23.828774", "delta": "0:00:01.226682", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11701 1727096123.84877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 11701 1727096123.84881: stdout chunk (state=3): >>><<< 11701 1727096123.84883: stderr chunk (state=3): >>><<< 11701 1727096123.84886: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 705 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 705 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-23 08:55:22.602092", "end": "2024-09-23 08:55:23.828774", "delta": "0:00:01.226682", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
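Stripped of the SSH debug noise, the "Create test interfaces" task above follows the same per-task remote lifecycle as every other module call in this trace: create a private temp directory under ~/.ansible/tmp, upload the built AnsiballZ payload over SFTP, chmod it and run it with the remote Python, then remove the directory. Replayed by hand it looks roughly like the sketch below; the directory name is a placeholder, since the real one embeds a timestamp, PID and random suffix.

#!/bin/sh
# Illustrative sketch of the remote lifecycle shown in the trace above;
# the tmpdir name and payload file name are placeholders, not Ansible's values.
host=10.31.14.152
tmp=/root/.ansible/tmp/ansible-tmp-example    # hypothetical remote tmpdir

ssh root@"$host" "umask 77 && mkdir -p $tmp"                    # 1. temp dir
echo "put AnsiballZ_command.py $tmp/AnsiballZ_command.py" \
    | sftp root@"$host"                                         # 2. upload module
ssh root@"$host" "chmod u+x $tmp $tmp/AnsiballZ_command.py && /usr/bin/python3.12 $tmp/AnsiballZ_command.py"   # 3. execute
ssh root@"$host" "rm -rf $tmp"                                  # 4. clean up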
11701 1727096123.84895: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096122.167961-12029-29829723691393/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096123.84912: _low_level_execute_command(): starting 11701 1727096123.84922: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096122.167961-12029-29829723691393/ > /dev/null 2>&1 && sleep 0' 11701 1727096123.85517: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096123.85534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096123.85588: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096123.85601: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096123.85653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096123.87571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096123.87603: stderr chunk (state=3): >>><<< 11701 1727096123.87607: stdout chunk (state=3): >>><<< 11701 1727096123.87621: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096123.87628: handler run complete 11701 1727096123.87645: Evaluated conditional (False): False 11701 1727096123.87654: attempt loop complete, returning result 11701 1727096123.87657: _execute() done 11701 1727096123.87659: dumping result to json 11701 1727096123.87665: done dumping result, returning 11701 1727096123.87675: done running TaskExecutor() for managed_node3/TASK: Create test interfaces [0afff68d-5257-a05c-c957-000000000012] 11701 1727096123.87679: sending task result for task 0afff68d-5257-a05c-c957-000000000012 11701 1727096123.87787: done sending task result for task 0afff68d-5257-a05c-c957-000000000012 11701 1727096123.87791: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.226682", "end": "2024-09-23 08:55:23.828774", "rc": 0, "start": "2024-09-23 08:55:22.602092" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 705 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 705 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + ip addr show testbr + grep -q 'inet [1-9]' + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 11701 1727096123.87871: no more pending results, returning what we have 11701 1727096123.87875: results queue empty 11701 1727096123.87876: checking for any_errors_fatal 11701 1727096123.87883: done checking for any_errors_fatal 11701 1727096123.87884: checking for max_fail_percentage 11701 1727096123.87885: done checking for max_fail_percentage 11701 1727096123.87886: checking to see if all hosts have failed and the running result is not ok 11701 1727096123.87887: done checking to see if all hosts have failed 11701 1727096123.87888: getting the remaining hosts for this loop 11701 1727096123.87889: done getting the remaining hosts for this loop 11701 1727096123.87892: getting the next task for host managed_node3 11701 1727096123.87904: done getting next task for host managed_node3 11701 1727096123.87907: ^ task is: TASK: 
Include the task 'get_interface_stat.yml' 11701 1727096123.87910: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11701 1727096123.87913: getting variables 11701 1727096123.87915: in VariableManager get_vars() 11701 1727096123.87955: Calling all_inventory to load vars for managed_node3 11701 1727096123.87958: Calling groups_inventory to load vars for managed_node3 11701 1727096123.87960: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096123.87978: Calling all_plugins_play to load vars for managed_node3 11701 1727096123.87980: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096123.87983: Calling groups_plugins_play to load vars for managed_node3 11701 1727096123.88124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096123.88271: done with get_vars() 11701 1727096123.88278: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Monday 23 September 2024 08:55:23 -0400 (0:00:01.765) 0:00:07.848 ****** 11701 1727096123.88345: entering _queue_task() for managed_node3/include_tasks 11701 1727096123.88561: worker is 1 (out of 1 available) 11701 1727096123.88576: exiting _queue_task() for managed_node3/include_tasks 11701 1727096123.88588: done queuing things up, now waiting for results queue to drain 11701 1727096123.88589: waiting for pending results... 
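The include being queued here comes from assert_device_present.yml in the test collection. Judging from the task paths printed in this log (assert_device_present.yml:3 for the include, :5 for the assert), the include_tasks action recorded above, and the interface_stat.stat.exists conditional evaluated further down, that file plausibly contains something like the sketch below; this is a reconstruction from the log, not the verbatim file.

  # assert_device_present.yml -- reconstructed sketch; the real file may differ
  - name: Include the task 'get_interface_stat.yml'
    include_tasks: get_interface_stat.yml

  - name: Assert that the interface is present - '{{ interface }}'
    assert:
      that:
        - interface_stat.stat.exists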
11701 1727096123.88732: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 11701 1727096123.88797: in run() - task 0afff68d-5257-a05c-c957-000000000016 11701 1727096123.88810: variable 'ansible_search_path' from source: unknown 11701 1727096123.88815: variable 'ansible_search_path' from source: unknown 11701 1727096123.88841: calling self._execute() 11701 1727096123.88903: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096123.88907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096123.88915: variable 'omit' from source: magic vars 11701 1727096123.89181: variable 'ansible_distribution_major_version' from source: facts 11701 1727096123.89192: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096123.89197: _execute() done 11701 1727096123.89200: dumping result to json 11701 1727096123.89202: done dumping result, returning 11701 1727096123.89209: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-a05c-c957-000000000016] 11701 1727096123.89214: sending task result for task 0afff68d-5257-a05c-c957-000000000016 11701 1727096123.89301: done sending task result for task 0afff68d-5257-a05c-c957-000000000016 11701 1727096123.89304: WORKER PROCESS EXITING 11701 1727096123.89333: no more pending results, returning what we have 11701 1727096123.89338: in VariableManager get_vars() 11701 1727096123.89388: Calling all_inventory to load vars for managed_node3 11701 1727096123.89391: Calling groups_inventory to load vars for managed_node3 11701 1727096123.89394: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096123.89405: Calling all_plugins_play to load vars for managed_node3 11701 1727096123.89407: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096123.89410: Calling groups_plugins_play to load vars for managed_node3 11701 1727096123.89545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096123.89665: done with get_vars() 11701 1727096123.89674: variable 'ansible_search_path' from source: unknown 11701 1727096123.89678: variable 'ansible_search_path' from source: unknown 11701 1727096123.89706: we have included files to process 11701 1727096123.89707: generating all_blocks data 11701 1727096123.89708: done generating all_blocks data 11701 1727096123.89709: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11701 1727096123.89709: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11701 1727096123.89711: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11701 1727096123.89874: done processing included file 11701 1727096123.89876: iterating over new_blocks loaded from include file 11701 1727096123.89877: in VariableManager get_vars() 11701 1727096123.89890: done with get_vars() 11701 1727096123.89891: filtering new block on tags 11701 1727096123.89901: done filtering new block on tags 11701 1727096123.89903: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 11701 
1727096123.89907: extending task lists for all hosts with included blocks 11701 1727096123.89969: done extending task lists 11701 1727096123.89970: done processing included files 11701 1727096123.89970: results queue empty 11701 1727096123.89971: checking for any_errors_fatal 11701 1727096123.89975: done checking for any_errors_fatal 11701 1727096123.89976: checking for max_fail_percentage 11701 1727096123.89976: done checking for max_fail_percentage 11701 1727096123.89977: checking to see if all hosts have failed and the running result is not ok 11701 1727096123.89977: done checking to see if all hosts have failed 11701 1727096123.89978: getting the remaining hosts for this loop 11701 1727096123.89978: done getting the remaining hosts for this loop 11701 1727096123.89980: getting the next task for host managed_node3 11701 1727096123.89982: done getting next task for host managed_node3 11701 1727096123.89984: ^ task is: TASK: Get stat for interface {{ interface }} 11701 1727096123.89986: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11701 1727096123.89987: getting variables 11701 1727096123.89988: in VariableManager get_vars() 11701 1727096123.89996: Calling all_inventory to load vars for managed_node3 11701 1727096123.89998: Calling groups_inventory to load vars for managed_node3 11701 1727096123.89999: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096123.90003: Calling all_plugins_play to load vars for managed_node3 11701 1727096123.90004: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096123.90006: Calling groups_plugins_play to load vars for managed_node3 11701 1727096123.90114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096123.90224: done with get_vars() 11701 1727096123.90233: done getting variables 11701 1727096123.90347: variable 'interface' from source: task vars 11701 1727096123.90353: variable 'dhcp_interface1' from source: play vars 11701 1727096123.90398: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Monday 23 September 2024 08:55:23 -0400 (0:00:00.020) 0:00:07.868 ****** 11701 1727096123.90427: entering _queue_task() for managed_node3/stat 11701 1727096123.90654: worker is 1 (out of 1 available) 11701 1727096123.90669: exiting _queue_task() for managed_node3/stat 11701 1727096123.90680: done queuing things up, now waiting for results queue to drain 11701 1727096123.90681: waiting for pending results... 
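The stat call that runs next is defined in get_interface_stat.yml. Based on the module arguments printed later in this log (path /sys/class/net/test1 with get_attributes, get_checksum, get_mime and follow all false) and on the interface_stat fact that the subsequent assert checks, a plausible sketch of that task file is shown below; the register name is inferred, not taken verbatim from the repository.

  # get_interface_stat.yml -- reconstructed sketch; actual contents may differ
  - name: Get stat for interface {{ interface }}
    stat:
      path: "/sys/class/net/{{ interface }}"
      follow: false
      get_attributes: false
      get_checksum: false
      get_mime: false
    register: interface_stat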
11701 1727096123.90828: running TaskExecutor() for managed_node3/TASK: Get stat for interface test1 11701 1727096123.90899: in run() - task 0afff68d-5257-a05c-c957-000000000152 11701 1727096123.90914: variable 'ansible_search_path' from source: unknown 11701 1727096123.90918: variable 'ansible_search_path' from source: unknown 11701 1727096123.90943: calling self._execute() 11701 1727096123.91005: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096123.91009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096123.91018: variable 'omit' from source: magic vars 11701 1727096123.91282: variable 'ansible_distribution_major_version' from source: facts 11701 1727096123.91292: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096123.91298: variable 'omit' from source: magic vars 11701 1727096123.91332: variable 'omit' from source: magic vars 11701 1727096123.91402: variable 'interface' from source: task vars 11701 1727096123.91406: variable 'dhcp_interface1' from source: play vars 11701 1727096123.91459: variable 'dhcp_interface1' from source: play vars 11701 1727096123.91469: variable 'omit' from source: magic vars 11701 1727096123.91503: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096123.91529: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096123.91545: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096123.91561: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096123.91574: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096123.91598: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096123.91606: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096123.91608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096123.91682: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096123.91686: Set connection var ansible_timeout to 10 11701 1727096123.91688: Set connection var ansible_shell_type to sh 11701 1727096123.91691: Set connection var ansible_shell_executable to /bin/sh 11701 1727096123.91693: Set connection var ansible_connection to ssh 11701 1727096123.91701: Set connection var ansible_pipelining to False 11701 1727096123.91718: variable 'ansible_shell_executable' from source: unknown 11701 1727096123.91720: variable 'ansible_connection' from source: unknown 11701 1727096123.91723: variable 'ansible_module_compression' from source: unknown 11701 1727096123.91725: variable 'ansible_shell_type' from source: unknown 11701 1727096123.91727: variable 'ansible_shell_executable' from source: unknown 11701 1727096123.91729: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096123.91733: variable 'ansible_pipelining' from source: unknown 11701 1727096123.91736: variable 'ansible_timeout' from source: unknown 11701 1727096123.91740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096123.91894: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11701 1727096123.91898: variable 'omit' from source: magic vars 11701 1727096123.91906: starting attempt loop 11701 1727096123.91909: running the handler 11701 1727096123.91919: _low_level_execute_command(): starting 11701 1727096123.91925: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096123.92444: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096123.92449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11701 1727096123.92453: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096123.92510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096123.92513: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096123.92516: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096123.92558: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096123.94242: stdout chunk (state=3): >>>/root <<< 11701 1727096123.94331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096123.94365: stderr chunk (state=3): >>><<< 11701 1727096123.94370: stdout chunk (state=3): >>><<< 11701 1727096123.94396: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096123.94407: _low_level_execute_command(): starting 
11701 1727096123.94413: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096123.9439595-12130-227382104795268 `" && echo ansible-tmp-1727096123.9439595-12130-227382104795268="` echo /root/.ansible/tmp/ansible-tmp-1727096123.9439595-12130-227382104795268 `" ) && sleep 0' 11701 1727096123.94872: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096123.94875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096123.94886: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096123.94889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096123.94936: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096123.94939: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096123.94948: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096123.94982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096123.97018: stdout chunk (state=3): >>>ansible-tmp-1727096123.9439595-12130-227382104795268=/root/.ansible/tmp/ansible-tmp-1727096123.9439595-12130-227382104795268 <<< 11701 1727096123.97284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096123.97307: stderr chunk (state=3): >>><<< 11701 1727096123.97317: stdout chunk (state=3): >>><<< 11701 1727096123.97344: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096123.9439595-12130-227382104795268=/root/.ansible/tmp/ansible-tmp-1727096123.9439595-12130-227382104795268 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096123.97419: variable 'ansible_module_compression' from source: unknown 11701 1727096123.97501: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11701ub3f79xu/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11701 1727096123.97545: variable 'ansible_facts' from source: unknown 11701 1727096123.97698: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096123.9439595-12130-227382104795268/AnsiballZ_stat.py 11701 1727096123.97897: Sending initial data 11701 1727096123.97900: Sent initial data (153 bytes) 11701 1727096123.98579: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096123.98595: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096123.98606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096123.98646: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096123.98664: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096123.98710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096124.00364: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096124.00400: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11701 1727096124.00428: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpkqfhfpq2 /root/.ansible/tmp/ansible-tmp-1727096123.9439595-12130-227382104795268/AnsiballZ_stat.py <<< 11701 1727096124.00431: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096123.9439595-12130-227382104795268/AnsiballZ_stat.py" <<< 11701 1727096124.00457: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpkqfhfpq2" to remote "/root/.ansible/tmp/ansible-tmp-1727096123.9439595-12130-227382104795268/AnsiballZ_stat.py" <<< 11701 1727096124.00464: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096123.9439595-12130-227382104795268/AnsiballZ_stat.py" <<< 11701 1727096124.00946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096124.00995: stderr chunk (state=3): >>><<< 11701 1727096124.00999: stdout chunk (state=3): >>><<< 11701 1727096124.01023: done transferring module to remote 11701 1727096124.01033: _low_level_execute_command(): starting 11701 1727096124.01037: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096123.9439595-12130-227382104795268/ /root/.ansible/tmp/ansible-tmp-1727096123.9439595-12130-227382104795268/AnsiballZ_stat.py && sleep 0' 11701 1727096124.01503: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096124.01506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096124.01508: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096124.01511: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096124.01517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096124.01564: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096124.01573: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096124.01606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096124.03474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096124.03504: stderr chunk (state=3): >>><<< 11701 1727096124.03507: stdout chunk (state=3): >>><<< 11701 1727096124.03523: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096124.03526: _low_level_execute_command(): starting 11701 1727096124.03530: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096123.9439595-12130-227382104795268/AnsiballZ_stat.py && sleep 0' 11701 1727096124.03997: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096124.04001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096124.04003: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096124.04005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096124.04063: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096124.04072: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096124.04074: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096124.04110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096124.20037: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27482, "dev": 23, "nlink": 1, "atime": 1727096122.6088595, "mtime": 1727096122.6088595, "ctime": 1727096122.6088595, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", 
"pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11701 1727096124.21500: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 11701 1727096124.21527: stderr chunk (state=3): >>><<< 11701 1727096124.21536: stdout chunk (state=3): >>><<< 11701 1727096124.21700: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27482, "dev": 23, "nlink": 1, "atime": 1727096122.6088595, "mtime": 1727096122.6088595, "ctime": 1727096122.6088595, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
11701 1727096124.21704: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096123.9439595-12130-227382104795268/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096124.21712: _low_level_execute_command(): starting 11701 1727096124.21714: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096123.9439595-12130-227382104795268/ > /dev/null 2>&1 && sleep 0' 11701 1727096124.22282: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096124.22371: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096124.22399: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096124.22419: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096124.22500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096124.24471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096124.24494: stderr chunk (state=3): >>><<< 11701 1727096124.24504: stdout chunk (state=3): >>><<< 11701 1727096124.24673: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096124.24677: handler run complete 11701 1727096124.24680: attempt loop complete, returning result 11701 1727096124.24682: _execute() done 11701 1727096124.24684: dumping result to json 11701 1727096124.24686: done dumping result, returning 11701 1727096124.24689: done running TaskExecutor() for managed_node3/TASK: Get stat for interface test1 [0afff68d-5257-a05c-c957-000000000152] 11701 1727096124.24691: sending task result for task 0afff68d-5257-a05c-c957-000000000152 11701 1727096124.24763: done sending task result for task 0afff68d-5257-a05c-c957-000000000152 11701 1727096124.24766: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1727096122.6088595, "block_size": 4096, "blocks": 0, "ctime": 1727096122.6088595, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27482, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1727096122.6088595, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11701 1727096124.25056: no more pending results, returning what we have 11701 1727096124.25059: results queue empty 11701 1727096124.25060: checking for any_errors_fatal 11701 1727096124.25061: done checking for any_errors_fatal 11701 1727096124.25062: checking for max_fail_percentage 11701 1727096124.25064: done checking for max_fail_percentage 11701 1727096124.25065: checking to see if all hosts have failed and the running result is not ok 11701 1727096124.25066: done checking to see if all hosts have failed 11701 1727096124.25066: getting the remaining hosts for this loop 11701 1727096124.25073: done getting the remaining hosts for this loop 11701 1727096124.25077: getting the next task for host managed_node3 11701 1727096124.25084: done getting next task for host managed_node3 11701 1727096124.25087: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11701 1727096124.25089: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096124.25094: getting variables 11701 1727096124.25096: in VariableManager get_vars() 11701 1727096124.25133: Calling all_inventory to load vars for managed_node3 11701 1727096124.25135: Calling groups_inventory to load vars for managed_node3 11701 1727096124.25138: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096124.25148: Calling all_plugins_play to load vars for managed_node3 11701 1727096124.25150: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096124.25153: Calling groups_plugins_play to load vars for managed_node3 11701 1727096124.25424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096124.25632: done with get_vars() 11701 1727096124.25643: done getting variables 11701 1727096124.25741: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 11701 1727096124.25863: variable 'interface' from source: task vars 11701 1727096124.25868: variable 'dhcp_interface1' from source: play vars 11701 1727096124.25932: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Monday 23 September 2024 08:55:24 -0400 (0:00:00.355) 0:00:08.224 ****** 11701 1727096124.25963: entering _queue_task() for managed_node3/assert 11701 1727096124.25965: Creating lock for assert 11701 1727096124.26363: worker is 1 (out of 1 available) 11701 1727096124.26376: exiting _queue_task() for managed_node3/assert 11701 1727096124.26385: done queuing things up, now waiting for results queue to drain 11701 1727096124.26386: waiting for pending results... 
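The same include now repeats for the second DHCP interface: 'interface' is resolved from task vars, while dhcp_interface1 and dhcp_interface2 come from play vars, so the calling test playbook presumably invokes assert_device_present.yml once per interface roughly as sketched below; the task names and relative path here are illustrative assumptions.

  # calling playbook -- illustrative sketch only
  - name: Assert device is present - '{{ dhcp_interface1 }}'
    include_tasks: tasks/assert_device_present.yml
    vars:
      interface: "{{ dhcp_interface1 }}"

  - name: Assert device is present - '{{ dhcp_interface2 }}'
    include_tasks: tasks/assert_device_present.yml
    vars:
      interface: "{{ dhcp_interface2 }}"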
11701 1727096124.26688: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test1' 11701 1727096124.26693: in run() - task 0afff68d-5257-a05c-c957-000000000017 11701 1727096124.26696: variable 'ansible_search_path' from source: unknown 11701 1727096124.26699: variable 'ansible_search_path' from source: unknown 11701 1727096124.26714: calling self._execute() 11701 1727096124.26799: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096124.26810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096124.26891: variable 'omit' from source: magic vars 11701 1727096124.27255: variable 'ansible_distribution_major_version' from source: facts 11701 1727096124.27275: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096124.27285: variable 'omit' from source: magic vars 11701 1727096124.27337: variable 'omit' from source: magic vars 11701 1727096124.27437: variable 'interface' from source: task vars 11701 1727096124.27451: variable 'dhcp_interface1' from source: play vars 11701 1727096124.27517: variable 'dhcp_interface1' from source: play vars 11701 1727096124.27544: variable 'omit' from source: magic vars 11701 1727096124.27594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096124.27635: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096124.27670: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096124.27692: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096124.27759: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096124.27762: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096124.27769: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096124.27772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096124.27857: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096124.27877: Set connection var ansible_timeout to 10 11701 1727096124.27884: Set connection var ansible_shell_type to sh 11701 1727096124.27894: Set connection var ansible_shell_executable to /bin/sh 11701 1727096124.27902: Set connection var ansible_connection to ssh 11701 1727096124.27915: Set connection var ansible_pipelining to False 11701 1727096124.27938: variable 'ansible_shell_executable' from source: unknown 11701 1727096124.27945: variable 'ansible_connection' from source: unknown 11701 1727096124.27973: variable 'ansible_module_compression' from source: unknown 11701 1727096124.27979: variable 'ansible_shell_type' from source: unknown 11701 1727096124.27982: variable 'ansible_shell_executable' from source: unknown 11701 1727096124.27984: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096124.27986: variable 'ansible_pipelining' from source: unknown 11701 1727096124.27988: variable 'ansible_timeout' from source: unknown 11701 1727096124.27990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096124.28191: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096124.28196: variable 'omit' from source: magic vars 11701 1727096124.28198: starting attempt loop 11701 1727096124.28200: running the handler 11701 1727096124.28299: variable 'interface_stat' from source: set_fact 11701 1727096124.28425: Evaluated conditional (interface_stat.stat.exists): True 11701 1727096124.28428: handler run complete 11701 1727096124.28430: attempt loop complete, returning result 11701 1727096124.28432: _execute() done 11701 1727096124.28434: dumping result to json 11701 1727096124.28437: done dumping result, returning 11701 1727096124.28439: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test1' [0afff68d-5257-a05c-c957-000000000017] 11701 1727096124.28441: sending task result for task 0afff68d-5257-a05c-c957-000000000017 11701 1727096124.28510: done sending task result for task 0afff68d-5257-a05c-c957-000000000017 11701 1727096124.28513: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11701 1727096124.28564: no more pending results, returning what we have 11701 1727096124.28569: results queue empty 11701 1727096124.28571: checking for any_errors_fatal 11701 1727096124.28579: done checking for any_errors_fatal 11701 1727096124.28580: checking for max_fail_percentage 11701 1727096124.28582: done checking for max_fail_percentage 11701 1727096124.28583: checking to see if all hosts have failed and the running result is not ok 11701 1727096124.28584: done checking to see if all hosts have failed 11701 1727096124.28584: getting the remaining hosts for this loop 11701 1727096124.28586: done getting the remaining hosts for this loop 11701 1727096124.28588: getting the next task for host managed_node3 11701 1727096124.28596: done getting next task for host managed_node3 11701 1727096124.28599: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11701 1727096124.28602: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096124.28605: getting variables 11701 1727096124.28607: in VariableManager get_vars() 11701 1727096124.28645: Calling all_inventory to load vars for managed_node3 11701 1727096124.28647: Calling groups_inventory to load vars for managed_node3 11701 1727096124.28650: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096124.28660: Calling all_plugins_play to load vars for managed_node3 11701 1727096124.28662: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096124.28665: Calling groups_plugins_play to load vars for managed_node3 11701 1727096124.29188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096124.29465: done with get_vars() 11701 1727096124.29479: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Monday 23 September 2024 08:55:24 -0400 (0:00:00.036) 0:00:08.260 ****** 11701 1727096124.29585: entering _queue_task() for managed_node3/include_tasks 11701 1727096124.30001: worker is 1 (out of 1 available) 11701 1727096124.30013: exiting _queue_task() for managed_node3/include_tasks 11701 1727096124.30023: done queuing things up, now waiting for results queue to drain 11701 1727096124.30024: waiting for pending results... 11701 1727096124.30482: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 11701 1727096124.30487: in run() - task 0afff68d-5257-a05c-c957-00000000001b 11701 1727096124.30489: variable 'ansible_search_path' from source: unknown 11701 1727096124.30491: variable 'ansible_search_path' from source: unknown 11701 1727096124.30494: calling self._execute() 11701 1727096124.30559: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096124.30576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096124.30590: variable 'omit' from source: magic vars 11701 1727096124.30943: variable 'ansible_distribution_major_version' from source: facts 11701 1727096124.30962: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096124.30975: _execute() done 11701 1727096124.30983: dumping result to json 11701 1727096124.30990: done dumping result, returning 11701 1727096124.31000: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-a05c-c957-00000000001b] 11701 1727096124.31015: sending task result for task 0afff68d-5257-a05c-c957-00000000001b 11701 1727096124.31188: no more pending results, returning what we have 11701 1727096124.31195: in VariableManager get_vars() 11701 1727096124.31465: Calling all_inventory to load vars for managed_node3 11701 1727096124.31470: Calling groups_inventory to load vars for managed_node3 11701 1727096124.31473: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096124.31479: done sending task result for task 0afff68d-5257-a05c-c957-00000000001b 11701 1727096124.31482: WORKER PROCESS EXITING 11701 1727096124.31492: Calling all_plugins_play to load vars for managed_node3 11701 1727096124.31495: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096124.31497: Calling groups_plugins_play to load vars for managed_node3 11701 1727096124.31686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 11701 1727096124.32119: done with get_vars() 11701 1727096124.32125: variable 'ansible_search_path' from source: unknown 11701 1727096124.32126: variable 'ansible_search_path' from source: unknown 11701 1727096124.32160: we have included files to process 11701 1727096124.32161: generating all_blocks data 11701 1727096124.32163: done generating all_blocks data 11701 1727096124.32165: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11701 1727096124.32166: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11701 1727096124.32170: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11701 1727096124.32350: done processing included file 11701 1727096124.32352: iterating over new_blocks loaded from include file 11701 1727096124.32353: in VariableManager get_vars() 11701 1727096124.32373: done with get_vars() 11701 1727096124.32375: filtering new block on tags 11701 1727096124.32390: done filtering new block on tags 11701 1727096124.32392: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 11701 1727096124.32397: extending task lists for all hosts with included blocks 11701 1727096124.32491: done extending task lists 11701 1727096124.32492: done processing included files 11701 1727096124.32493: results queue empty 11701 1727096124.32494: checking for any_errors_fatal 11701 1727096124.32496: done checking for any_errors_fatal 11701 1727096124.32497: checking for max_fail_percentage 11701 1727096124.32498: done checking for max_fail_percentage 11701 1727096124.32499: checking to see if all hosts have failed and the running result is not ok 11701 1727096124.32500: done checking to see if all hosts have failed 11701 1727096124.32500: getting the remaining hosts for this loop 11701 1727096124.32501: done getting the remaining hosts for this loop 11701 1727096124.32504: getting the next task for host managed_node3 11701 1727096124.32507: done getting next task for host managed_node3 11701 1727096124.32509: ^ task is: TASK: Get stat for interface {{ interface }} 11701 1727096124.32512: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096124.32514: getting variables 11701 1727096124.32515: in VariableManager get_vars() 11701 1727096124.32527: Calling all_inventory to load vars for managed_node3 11701 1727096124.32529: Calling groups_inventory to load vars for managed_node3 11701 1727096124.32531: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096124.32536: Calling all_plugins_play to load vars for managed_node3 11701 1727096124.32538: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096124.32541: Calling groups_plugins_play to load vars for managed_node3 11701 1727096124.32685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096124.32880: done with get_vars() 11701 1727096124.32889: done getting variables 11701 1727096124.33047: variable 'interface' from source: task vars 11701 1727096124.33051: variable 'dhcp_interface2' from source: play vars 11701 1727096124.33116: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Monday 23 September 2024 08:55:24 -0400 (0:00:00.035) 0:00:08.296 ****** 11701 1727096124.33145: entering _queue_task() for managed_node3/stat 11701 1727096124.33677: worker is 1 (out of 1 available) 11701 1727096124.33686: exiting _queue_task() for managed_node3/stat 11701 1727096124.33694: done queuing things up, now waiting for results queue to drain 11701 1727096124.33696: waiting for pending results... 11701 1727096124.33825: running TaskExecutor() for managed_node3/TASK: Get stat for interface test2 11701 1727096124.33874: in run() - task 0afff68d-5257-a05c-c957-00000000016a 11701 1727096124.33893: variable 'ansible_search_path' from source: unknown 11701 1727096124.33901: variable 'ansible_search_path' from source: unknown 11701 1727096124.33948: calling self._execute() 11701 1727096124.34038: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096124.34052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096124.34065: variable 'omit' from source: magic vars 11701 1727096124.34470: variable 'ansible_distribution_major_version' from source: facts 11701 1727096124.34474: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096124.34480: variable 'omit' from source: magic vars 11701 1727096124.34525: variable 'omit' from source: magic vars 11701 1727096124.34674: variable 'interface' from source: task vars 11701 1727096124.34683: variable 'dhcp_interface2' from source: play vars 11701 1727096124.34746: variable 'dhcp_interface2' from source: play vars 11701 1727096124.34773: variable 'omit' from source: magic vars 11701 1727096124.34857: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096124.34907: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096124.35011: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096124.35014: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096124.35017: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 11701 1727096124.35037: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096124.35046: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096124.35054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096124.35175: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096124.35186: Set connection var ansible_timeout to 10 11701 1727096124.35194: Set connection var ansible_shell_type to sh 11701 1727096124.35204: Set connection var ansible_shell_executable to /bin/sh 11701 1727096124.35211: Set connection var ansible_connection to ssh 11701 1727096124.35230: Set connection var ansible_pipelining to False 11701 1727096124.35368: variable 'ansible_shell_executable' from source: unknown 11701 1727096124.35374: variable 'ansible_connection' from source: unknown 11701 1727096124.35446: variable 'ansible_module_compression' from source: unknown 11701 1727096124.35449: variable 'ansible_shell_type' from source: unknown 11701 1727096124.35451: variable 'ansible_shell_executable' from source: unknown 11701 1727096124.35454: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096124.35456: variable 'ansible_pipelining' from source: unknown 11701 1727096124.35460: variable 'ansible_timeout' from source: unknown 11701 1727096124.35463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096124.35693: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11701 1727096124.35715: variable 'omit' from source: magic vars 11701 1727096124.35787: starting attempt loop 11701 1727096124.35790: running the handler 11701 1727096124.35793: _low_level_execute_command(): starting 11701 1727096124.35795: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096124.36754: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096124.36813: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096124.36870: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096124.36888: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096124.36919: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096124.36995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 
1727096124.38701: stdout chunk (state=3): >>>/root <<< 11701 1727096124.38859: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096124.38863: stdout chunk (state=3): >>><<< 11701 1727096124.38865: stderr chunk (state=3): >>><<< 11701 1727096124.38892: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096124.38918: _low_level_execute_command(): starting 11701 1727096124.39005: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096124.3890288-12153-75047242042744 `" && echo ansible-tmp-1727096124.3890288-12153-75047242042744="` echo /root/.ansible/tmp/ansible-tmp-1727096124.3890288-12153-75047242042744 `" ) && sleep 0' 11701 1727096124.39716: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096124.39742: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096124.39777: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096124.39847: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096124.41837: stdout chunk (state=3): >>>ansible-tmp-1727096124.3890288-12153-75047242042744=/root/.ansible/tmp/ansible-tmp-1727096124.3890288-12153-75047242042744 <<< 11701 1727096124.42006: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 <<< 11701 1727096124.42010: stdout chunk (state=3): >>><<< 11701 1727096124.42012: stderr chunk (state=3): >>><<< 11701 1727096124.42031: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096124.3890288-12153-75047242042744=/root/.ansible/tmp/ansible-tmp-1727096124.3890288-12153-75047242042744 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096124.42173: variable 'ansible_module_compression' from source: unknown 11701 1727096124.42176: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11701ub3f79xu/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11701 1727096124.42203: variable 'ansible_facts' from source: unknown 11701 1727096124.42313: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096124.3890288-12153-75047242042744/AnsiballZ_stat.py 11701 1727096124.42494: Sending initial data 11701 1727096124.42497: Sent initial data (152 bytes) 11701 1727096124.43127: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096124.43177: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11701 1727096124.43192: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 11701 1727096124.43287: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096124.43307: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096124.43376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096124.45048: stderr 
chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096124.45141: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11701 1727096124.45196: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpsy93v3w3 /root/.ansible/tmp/ansible-tmp-1727096124.3890288-12153-75047242042744/AnsiballZ_stat.py <<< 11701 1727096124.45199: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096124.3890288-12153-75047242042744/AnsiballZ_stat.py" <<< 11701 1727096124.45374: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpsy93v3w3" to remote "/root/.ansible/tmp/ansible-tmp-1727096124.3890288-12153-75047242042744/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096124.3890288-12153-75047242042744/AnsiballZ_stat.py" <<< 11701 1727096124.46186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096124.46205: stderr chunk (state=3): >>><<< 11701 1727096124.46212: stdout chunk (state=3): >>><<< 11701 1727096124.46245: done transferring module to remote 11701 1727096124.46305: _low_level_execute_command(): starting 11701 1727096124.46309: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096124.3890288-12153-75047242042744/ /root/.ansible/tmp/ansible-tmp-1727096124.3890288-12153-75047242042744/AnsiballZ_stat.py && sleep 0' 11701 1727096124.46974: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096124.47034: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096124.47051: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096124.47087: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 11701 1727096124.47142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096124.49086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096124.49125: stdout chunk (state=3): >>><<< 11701 1727096124.49128: stderr chunk (state=3): >>><<< 11701 1727096124.49146: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096124.49257: _low_level_execute_command(): starting 11701 1727096124.49261: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096124.3890288-12153-75047242042744/AnsiballZ_stat.py && sleep 0' 11701 1727096124.50029: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096124.50106: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096124.50190: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096124.50213: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096124.50238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096124.50348: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096124.66019: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": 
false, "uid": 0, "gid": 0, "size": 0, "inode": 27888, "dev": 23, "nlink": 1, "atime": 1727096122.6155634, "mtime": 1727096122.6155634, "ctime": 1727096122.6155634, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11701 1727096124.67427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 11701 1727096124.67453: stderr chunk (state=3): >>><<< 11701 1727096124.67460: stdout chunk (state=3): >>><<< 11701 1727096124.67485: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27888, "dev": 23, "nlink": 1, "atime": 1727096122.6155634, "mtime": 1727096122.6155634, "ctime": 1727096122.6155634, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
11701 1727096124.67520: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096124.3890288-12153-75047242042744/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096124.67528: _low_level_execute_command(): starting 11701 1727096124.67533: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096124.3890288-12153-75047242042744/ > /dev/null 2>&1 && sleep 0' 11701 1727096124.67987: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096124.67991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096124.67993: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096124.67996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096124.68059: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096124.68062: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096124.68065: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096124.68092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096124.69947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096124.69977: stderr chunk (state=3): >>><<< 11701 1727096124.69981: stdout chunk (state=3): >>><<< 11701 1727096124.69996: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096124.70004: handler run complete 11701 1727096124.70032: attempt loop complete, returning result 11701 1727096124.70035: _execute() done 11701 1727096124.70038: dumping result to json 11701 1727096124.70043: done dumping result, returning 11701 1727096124.70056: done running TaskExecutor() for managed_node3/TASK: Get stat for interface test2 [0afff68d-5257-a05c-c957-00000000016a] 11701 1727096124.70058: sending task result for task 0afff68d-5257-a05c-c957-00000000016a 11701 1727096124.70161: done sending task result for task 0afff68d-5257-a05c-c957-00000000016a 11701 1727096124.70164: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1727096122.6155634, "block_size": 4096, "blocks": 0, "ctime": 1727096122.6155634, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27888, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1727096122.6155634, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11701 1727096124.70256: no more pending results, returning what we have 11701 1727096124.70259: results queue empty 11701 1727096124.70260: checking for any_errors_fatal 11701 1727096124.70261: done checking for any_errors_fatal 11701 1727096124.70262: checking for max_fail_percentage 11701 1727096124.70264: done checking for max_fail_percentage 11701 1727096124.70265: checking to see if all hosts have failed and the running result is not ok 11701 1727096124.70266: done checking to see if all hosts have failed 11701 1727096124.70267: getting the remaining hosts for this loop 11701 1727096124.70275: done getting the remaining hosts for this loop 11701 1727096124.70279: getting the next task for host managed_node3 11701 1727096124.70286: done getting next task for host managed_node3 11701 1727096124.70288: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11701 1727096124.70291: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096124.70295: getting variables 11701 1727096124.70296: in VariableManager get_vars() 11701 1727096124.70332: Calling all_inventory to load vars for managed_node3 11701 1727096124.70334: Calling groups_inventory to load vars for managed_node3 11701 1727096124.70336: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096124.70345: Calling all_plugins_play to load vars for managed_node3 11701 1727096124.70348: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096124.70353: Calling groups_plugins_play to load vars for managed_node3 11701 1727096124.70524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096124.70644: done with get_vars() 11701 1727096124.70655: done getting variables 11701 1727096124.70700: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11701 1727096124.70790: variable 'interface' from source: task vars 11701 1727096124.70793: variable 'dhcp_interface2' from source: play vars 11701 1727096124.70836: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Monday 23 September 2024 08:55:24 -0400 (0:00:00.377) 0:00:08.673 ****** 11701 1727096124.70861: entering _queue_task() for managed_node3/assert 11701 1727096124.71084: worker is 1 (out of 1 available) 11701 1727096124.71097: exiting _queue_task() for managed_node3/assert 11701 1727096124.71107: done queuing things up, now waiting for results queue to drain 11701 1727096124.71109: waiting for pending results... 
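The assertion queued here (assert_device_present.yml:5) consumes the stat result from the previous task; a few events further down, the conditional interface_stat.stat.exists is evaluated and comes back True. A minimal sketch of such an assertion task is shown below; only the condition and the task name are confirmed by the trace, and the failure message is hypothetical.

---
# Sketch of the assert at assert_device_present.yml:5; only the
# "interface_stat.stat.exists" condition is confirmed by the trace,
# the failure message wording is hypothetical.
- name: Assert that the interface is present - '{{ interface }}'
  ansible.builtin.assert:
    that:
      - interface_stat.stat.exists
    fail_msg: "Interface {{ interface }} is not present"   # hypothetical wording
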
11701 1727096124.71265: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test2' 11701 1727096124.71339: in run() - task 0afff68d-5257-a05c-c957-00000000001c 11701 1727096124.71360: variable 'ansible_search_path' from source: unknown 11701 1727096124.71364: variable 'ansible_search_path' from source: unknown 11701 1727096124.71388: calling self._execute() 11701 1727096124.71452: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096124.71456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096124.71464: variable 'omit' from source: magic vars 11701 1727096124.71727: variable 'ansible_distribution_major_version' from source: facts 11701 1727096124.71738: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096124.71747: variable 'omit' from source: magic vars 11701 1727096124.71782: variable 'omit' from source: magic vars 11701 1727096124.71850: variable 'interface' from source: task vars 11701 1727096124.71856: variable 'dhcp_interface2' from source: play vars 11701 1727096124.71906: variable 'dhcp_interface2' from source: play vars 11701 1727096124.71921: variable 'omit' from source: magic vars 11701 1727096124.71953: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096124.71984: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096124.72001: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096124.72017: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096124.72027: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096124.72049: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096124.72055: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096124.72058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096124.72130: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096124.72135: Set connection var ansible_timeout to 10 11701 1727096124.72138: Set connection var ansible_shell_type to sh 11701 1727096124.72143: Set connection var ansible_shell_executable to /bin/sh 11701 1727096124.72146: Set connection var ansible_connection to ssh 11701 1727096124.72156: Set connection var ansible_pipelining to False 11701 1727096124.72173: variable 'ansible_shell_executable' from source: unknown 11701 1727096124.72176: variable 'ansible_connection' from source: unknown 11701 1727096124.72179: variable 'ansible_module_compression' from source: unknown 11701 1727096124.72181: variable 'ansible_shell_type' from source: unknown 11701 1727096124.72183: variable 'ansible_shell_executable' from source: unknown 11701 1727096124.72185: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096124.72190: variable 'ansible_pipelining' from source: unknown 11701 1727096124.72194: variable 'ansible_timeout' from source: unknown 11701 1727096124.72196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096124.72299: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096124.72309: variable 'omit' from source: magic vars 11701 1727096124.72312: starting attempt loop 11701 1727096124.72315: running the handler 11701 1727096124.72405: variable 'interface_stat' from source: set_fact 11701 1727096124.72420: Evaluated conditional (interface_stat.stat.exists): True 11701 1727096124.72425: handler run complete 11701 1727096124.72436: attempt loop complete, returning result 11701 1727096124.72440: _execute() done 11701 1727096124.72443: dumping result to json 11701 1727096124.72445: done dumping result, returning 11701 1727096124.72454: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test2' [0afff68d-5257-a05c-c957-00000000001c] 11701 1727096124.72457: sending task result for task 0afff68d-5257-a05c-c957-00000000001c 11701 1727096124.72537: done sending task result for task 0afff68d-5257-a05c-c957-00000000001c 11701 1727096124.72540: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11701 1727096124.72593: no more pending results, returning what we have 11701 1727096124.72596: results queue empty 11701 1727096124.72597: checking for any_errors_fatal 11701 1727096124.72605: done checking for any_errors_fatal 11701 1727096124.72606: checking for max_fail_percentage 11701 1727096124.72608: done checking for max_fail_percentage 11701 1727096124.72609: checking to see if all hosts have failed and the running result is not ok 11701 1727096124.72610: done checking to see if all hosts have failed 11701 1727096124.72611: getting the remaining hosts for this loop 11701 1727096124.72612: done getting the remaining hosts for this loop 11701 1727096124.72615: getting the next task for host managed_node3 11701 1727096124.72621: done getting next task for host managed_node3 11701 1727096124.72624: ^ task is: TASK: Backup the /etc/resolv.conf for initscript 11701 1727096124.72627: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096124.72630: getting variables 11701 1727096124.72632: in VariableManager get_vars() 11701 1727096124.72673: Calling all_inventory to load vars for managed_node3 11701 1727096124.72682: Calling groups_inventory to load vars for managed_node3 11701 1727096124.72685: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096124.72694: Calling all_plugins_play to load vars for managed_node3 11701 1727096124.72696: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096124.72698: Calling groups_plugins_play to load vars for managed_node3 11701 1727096124.72836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096124.72953: done with get_vars() 11701 1727096124.72961: done getting variables 11701 1727096124.73003: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Backup the /etc/resolv.conf for initscript] ****************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:28 Monday 23 September 2024 08:55:24 -0400 (0:00:00.021) 0:00:08.694 ****** 11701 1727096124.73023: entering _queue_task() for managed_node3/command 11701 1727096124.73230: worker is 1 (out of 1 available) 11701 1727096124.73243: exiting _queue_task() for managed_node3/command 11701 1727096124.73254: done queuing things up, now waiting for results queue to drain 11701 1727096124.73256: waiting for pending results... 11701 1727096124.73413: running TaskExecutor() for managed_node3/TASK: Backup the /etc/resolv.conf for initscript 11701 1727096124.73473: in run() - task 0afff68d-5257-a05c-c957-00000000001d 11701 1727096124.73488: variable 'ansible_search_path' from source: unknown 11701 1727096124.73513: calling self._execute() 11701 1727096124.73577: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096124.73581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096124.73596: variable 'omit' from source: magic vars 11701 1727096124.73904: variable 'ansible_distribution_major_version' from source: facts 11701 1727096124.73915: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096124.73996: variable 'network_provider' from source: set_fact 11701 1727096124.73999: Evaluated conditional (network_provider == "initscripts"): False 11701 1727096124.74002: when evaluation is False, skipping this task 11701 1727096124.74011: _execute() done 11701 1727096124.74014: dumping result to json 11701 1727096124.74016: done dumping result, returning 11701 1727096124.74022: done running TaskExecutor() for managed_node3/TASK: Backup the /etc/resolv.conf for initscript [0afff68d-5257-a05c-c957-00000000001d] 11701 1727096124.74032: sending task result for task 0afff68d-5257-a05c-c957-00000000001d 11701 1727096124.74107: done sending task result for task 0afff68d-5257-a05c-c957-00000000001d 11701 1727096124.74110: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11701 1727096124.74184: no more pending results, returning what we have 11701 1727096124.74188: results queue 
empty 11701 1727096124.74189: checking for any_errors_fatal 11701 1727096124.74195: done checking for any_errors_fatal 11701 1727096124.74195: checking for max_fail_percentage 11701 1727096124.74197: done checking for max_fail_percentage 11701 1727096124.74198: checking to see if all hosts have failed and the running result is not ok 11701 1727096124.74198: done checking to see if all hosts have failed 11701 1727096124.74199: getting the remaining hosts for this loop 11701 1727096124.74200: done getting the remaining hosts for this loop 11701 1727096124.74204: getting the next task for host managed_node3 11701 1727096124.74209: done getting next task for host managed_node3 11701 1727096124.74211: ^ task is: TASK: TEST Add Bond with 2 ports 11701 1727096124.74213: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11701 1727096124.74215: getting variables 11701 1727096124.74217: in VariableManager get_vars() 11701 1727096124.74257: Calling all_inventory to load vars for managed_node3 11701 1727096124.74260: Calling groups_inventory to load vars for managed_node3 11701 1727096124.74262: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096124.74272: Calling all_plugins_play to load vars for managed_node3 11701 1727096124.74275: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096124.74278: Calling groups_plugins_play to load vars for managed_node3 11701 1727096124.74426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096124.74540: done with get_vars() 11701 1727096124.74548: done getting variables 11701 1727096124.74593: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEST Add Bond with 2 ports] ********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:33 Monday 23 September 2024 08:55:24 -0400 (0:00:00.015) 0:00:08.710 ****** 11701 1727096124.74613: entering _queue_task() for managed_node3/debug 11701 1727096124.74821: worker is 1 (out of 1 available) 11701 1727096124.74834: exiting _queue_task() for managed_node3/debug 11701 1727096124.74844: done queuing things up, now waiting for results queue to drain 11701 1727096124.74845: waiting for pending results... 
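Just above, the resolv.conf backup task was skipped because this run is not using the initscripts provider (false_condition: network_provider == "initscripts"), and the trace now queues the "TEST Add Bond with 2 ports" banner task. The skip illustrates Ansible's when: gating; a minimal sketch of such a provider-gated task follows, with a hypothetical command body since the actual command line is not printed in this trace.

---
# Sketch of a provider-gated task like "Backup the /etc/resolv.conf for initscript".
# The when: expression is taken from the logged false_condition; the cp command and
# its destination are hypothetical, as the real command is not shown here.
- name: Backup the /etc/resolv.conf for initscript
  ansible.builtin.command: cp /etc/resolv.conf /tmp/resolv.conf.backup
  when: network_provider == "initscripts"
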
11701 1727096124.75003: running TaskExecutor() for managed_node3/TASK: TEST Add Bond with 2 ports 11701 1727096124.75063: in run() - task 0afff68d-5257-a05c-c957-00000000001e 11701 1727096124.75078: variable 'ansible_search_path' from source: unknown 11701 1727096124.75111: calling self._execute() 11701 1727096124.75175: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096124.75187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096124.75191: variable 'omit' from source: magic vars 11701 1727096124.75456: variable 'ansible_distribution_major_version' from source: facts 11701 1727096124.75464: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096124.75472: variable 'omit' from source: magic vars 11701 1727096124.75487: variable 'omit' from source: magic vars 11701 1727096124.75514: variable 'omit' from source: magic vars 11701 1727096124.75547: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096124.75576: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096124.75593: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096124.75607: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096124.75622: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096124.75643: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096124.75646: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096124.75649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096124.75718: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096124.75724: Set connection var ansible_timeout to 10 11701 1727096124.75726: Set connection var ansible_shell_type to sh 11701 1727096124.75729: Set connection var ansible_shell_executable to /bin/sh 11701 1727096124.75731: Set connection var ansible_connection to ssh 11701 1727096124.75741: Set connection var ansible_pipelining to False 11701 1727096124.75758: variable 'ansible_shell_executable' from source: unknown 11701 1727096124.75761: variable 'ansible_connection' from source: unknown 11701 1727096124.75763: variable 'ansible_module_compression' from source: unknown 11701 1727096124.75766: variable 'ansible_shell_type' from source: unknown 11701 1727096124.75771: variable 'ansible_shell_executable' from source: unknown 11701 1727096124.75773: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096124.75775: variable 'ansible_pipelining' from source: unknown 11701 1727096124.75778: variable 'ansible_timeout' from source: unknown 11701 1727096124.75782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096124.75886: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096124.75895: variable 'omit' from source: magic vars 11701 1727096124.75900: starting attempt loop 11701 1727096124.75903: running the 
handler 11701 1727096124.75939: handler run complete 11701 1727096124.75959: attempt loop complete, returning result 11701 1727096124.75962: _execute() done 11701 1727096124.75965: dumping result to json 11701 1727096124.75969: done dumping result, returning 11701 1727096124.75971: done running TaskExecutor() for managed_node3/TASK: TEST Add Bond with 2 ports [0afff68d-5257-a05c-c957-00000000001e] 11701 1727096124.75973: sending task result for task 0afff68d-5257-a05c-c957-00000000001e 11701 1727096124.76048: done sending task result for task 0afff68d-5257-a05c-c957-00000000001e 11701 1727096124.76053: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: ################################################## 11701 1727096124.76106: no more pending results, returning what we have 11701 1727096124.76110: results queue empty 11701 1727096124.76111: checking for any_errors_fatal 11701 1727096124.76115: done checking for any_errors_fatal 11701 1727096124.76116: checking for max_fail_percentage 11701 1727096124.76117: done checking for max_fail_percentage 11701 1727096124.76118: checking to see if all hosts have failed and the running result is not ok 11701 1727096124.76119: done checking to see if all hosts have failed 11701 1727096124.76120: getting the remaining hosts for this loop 11701 1727096124.76121: done getting the remaining hosts for this loop 11701 1727096124.76124: getting the next task for host managed_node3 11701 1727096124.76131: done getting next task for host managed_node3 11701 1727096124.76136: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11701 1727096124.76139: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096124.76157: getting variables 11701 1727096124.76158: in VariableManager get_vars() 11701 1727096124.76197: Calling all_inventory to load vars for managed_node3 11701 1727096124.76200: Calling groups_inventory to load vars for managed_node3 11701 1727096124.76202: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096124.76210: Calling all_plugins_play to load vars for managed_node3 11701 1727096124.76212: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096124.76214: Calling groups_plugins_play to load vars for managed_node3 11701 1727096124.76345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096124.76495: done with get_vars() 11701 1727096124.76503: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 08:55:24 -0400 (0:00:00.019) 0:00:08.730 ****** 11701 1727096124.76570: entering _queue_task() for managed_node3/include_tasks 11701 1727096124.76793: worker is 1 (out of 1 available) 11701 1727096124.76805: exiting _queue_task() for managed_node3/include_tasks 11701 1727096124.76816: done queuing things up, now waiting for results queue to drain 11701 1727096124.76817: waiting for pending results... 11701 1727096124.76982: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11701 1727096124.77069: in run() - task 0afff68d-5257-a05c-c957-000000000026 11701 1727096124.77081: variable 'ansible_search_path' from source: unknown 11701 1727096124.77084: variable 'ansible_search_path' from source: unknown 11701 1727096124.77112: calling self._execute() 11701 1727096124.77179: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096124.77183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096124.77191: variable 'omit' from source: magic vars 11701 1727096124.77452: variable 'ansible_distribution_major_version' from source: facts 11701 1727096124.77464: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096124.77470: _execute() done 11701 1727096124.77474: dumping result to json 11701 1727096124.77478: done dumping result, returning 11701 1727096124.77489: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-a05c-c957-000000000026] 11701 1727096124.77492: sending task result for task 0afff68d-5257-a05c-c957-000000000026 11701 1727096124.77572: done sending task result for task 0afff68d-5257-a05c-c957-000000000026 11701 1727096124.77575: WORKER PROCESS EXITING 11701 1727096124.77631: no more pending results, returning what we have 11701 1727096124.77635: in VariableManager get_vars() 11701 1727096124.77682: Calling all_inventory to load vars for managed_node3 11701 1727096124.77685: Calling groups_inventory to load vars for managed_node3 11701 1727096124.77688: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096124.77700: Calling all_plugins_play to load vars for managed_node3 11701 1727096124.77702: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096124.77705: Calling groups_plugins_play to load vars for managed_node3 11701 1727096124.77862: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096124.77989: done with get_vars() 11701 1727096124.77996: variable 'ansible_search_path' from source: unknown 11701 1727096124.77997: variable 'ansible_search_path' from source: unknown 11701 1727096124.78023: we have included files to process 11701 1727096124.78024: generating all_blocks data 11701 1727096124.78025: done generating all_blocks data 11701 1727096124.78028: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11701 1727096124.78029: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11701 1727096124.78030: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11701 1727096124.78514: done processing included file 11701 1727096124.78516: iterating over new_blocks loaded from include file 11701 1727096124.78517: in VariableManager get_vars() 11701 1727096124.78534: done with get_vars() 11701 1727096124.78536: filtering new block on tags 11701 1727096124.78548: done filtering new block on tags 11701 1727096124.78550: in VariableManager get_vars() 11701 1727096124.78564: done with get_vars() 11701 1727096124.78565: filtering new block on tags 11701 1727096124.78579: done filtering new block on tags 11701 1727096124.78581: in VariableManager get_vars() 11701 1727096124.78593: done with get_vars() 11701 1727096124.78594: filtering new block on tags 11701 1727096124.78605: done filtering new block on tags 11701 1727096124.78606: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 11701 1727096124.78610: extending task lists for all hosts with included blocks 11701 1727096124.79083: done extending task lists 11701 1727096124.79084: done processing included files 11701 1727096124.79085: results queue empty 11701 1727096124.79085: checking for any_errors_fatal 11701 1727096124.79088: done checking for any_errors_fatal 11701 1727096124.79088: checking for max_fail_percentage 11701 1727096124.79089: done checking for max_fail_percentage 11701 1727096124.79090: checking to see if all hosts have failed and the running result is not ok 11701 1727096124.79090: done checking to see if all hosts have failed 11701 1727096124.79090: getting the remaining hosts for this loop 11701 1727096124.79091: done getting the remaining hosts for this loop 11701 1727096124.79093: getting the next task for host managed_node3 11701 1727096124.79096: done getting next task for host managed_node3 11701 1727096124.79097: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11701 1727096124.79099: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11701 1727096124.79106: getting variables 11701 1727096124.79107: in VariableManager get_vars() 11701 1727096124.79117: Calling all_inventory to load vars for managed_node3 11701 1727096124.79119: Calling groups_inventory to load vars for managed_node3 11701 1727096124.79120: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096124.79124: Calling all_plugins_play to load vars for managed_node3 11701 1727096124.79126: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096124.79128: Calling groups_plugins_play to load vars for managed_node3 11701 1727096124.79232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096124.79346: done with get_vars() 11701 1727096124.79353: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Monday 23 September 2024 08:55:24 -0400 (0:00:00.028) 0:00:08.758 ****** 11701 1727096124.79410: entering _queue_task() for managed_node3/setup 11701 1727096124.79734: worker is 1 (out of 1 available) 11701 1727096124.79746: exiting _queue_task() for managed_node3/setup 11701 1727096124.79757: done queuing things up, now waiting for results queue to drain 11701 1727096124.79758: waiting for pending results... 11701 1727096124.80186: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11701 1727096124.80192: in run() - task 0afff68d-5257-a05c-c957-000000000188 11701 1727096124.80196: variable 'ansible_search_path' from source: unknown 11701 1727096124.80198: variable 'ansible_search_path' from source: unknown 11701 1727096124.80226: calling self._execute() 11701 1727096124.80311: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096124.80373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096124.80377: variable 'omit' from source: magic vars 11701 1727096124.80688: variable 'ansible_distribution_major_version' from source: facts 11701 1727096124.80697: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096124.80848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11701 1727096124.82303: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11701 1727096124.82348: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11701 1727096124.82380: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11701 1727096124.82572: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11701 1727096124.82575: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11701 1727096124.82578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
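The task queued at set_facts.yml:3 guards fact gathering: as the conditional evaluated a few events below shows, it only runs setup when one of the facts required by the role is missing, and its result is suppressed with no_log. A minimal sketch of that guard pattern follows; the when: expression and no_log come from the trace, while any setup arguments are unknown and therefore omitted.

---
# Sketch of the fact-gathering guard at roles/network/tasks/set_facts.yml:3.
# The when: expression and no_log are taken from the trace; setup arguments,
# if any, are not visible in this log and are left out.
- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  no_log: true
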
11701 1727096124.82581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096124.82590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096124.82634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096124.82657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096124.82715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096124.82743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096124.82779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096124.82821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096124.82841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096124.83011: variable '__network_required_facts' from source: role '' defaults 11701 1727096124.83023: variable 'ansible_facts' from source: unknown 11701 1727096124.83110: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 11701 1727096124.83119: when evaluation is False, skipping this task 11701 1727096124.83127: _execute() done 11701 1727096124.83133: dumping result to json 11701 1727096124.83139: done dumping result, returning 11701 1727096124.83155: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-a05c-c957-000000000188] 11701 1727096124.83165: sending task result for task 0afff68d-5257-a05c-c957-000000000188 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11701 1727096124.83413: no more pending results, returning what we have 11701 1727096124.83416: results queue empty 11701 1727096124.83417: checking for any_errors_fatal 11701 1727096124.83419: done checking for any_errors_fatal 11701 1727096124.83419: checking for max_fail_percentage 11701 1727096124.83421: done checking for max_fail_percentage 11701 1727096124.83422: checking to see if all hosts have failed and the running result is not ok 11701 1727096124.83423: done checking to see if all hosts have failed 11701 1727096124.83423: getting the remaining hosts for 
this loop 11701 1727096124.83424: done getting the remaining hosts for this loop 11701 1727096124.83429: getting the next task for host managed_node3 11701 1727096124.83437: done getting next task for host managed_node3 11701 1727096124.83441: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 11701 1727096124.83445: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11701 1727096124.83458: getting variables 11701 1727096124.83460: in VariableManager get_vars() 11701 1727096124.83500: Calling all_inventory to load vars for managed_node3 11701 1727096124.83502: Calling groups_inventory to load vars for managed_node3 11701 1727096124.83504: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096124.83513: Calling all_plugins_play to load vars for managed_node3 11701 1727096124.83515: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096124.83518: Calling groups_plugins_play to load vars for managed_node3 11701 1727096124.83849: done sending task result for task 0afff68d-5257-a05c-c957-000000000188 11701 1727096124.83855: WORKER PROCESS EXITING 11701 1727096124.83871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096124.84090: done with get_vars() 11701 1727096124.84102: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Monday 23 September 2024 08:55:24 -0400 (0:00:00.047) 0:00:08.806 ****** 11701 1727096124.84207: entering _queue_task() for managed_node3/stat 11701 1727096124.84488: worker is 1 (out of 1 available) 11701 1727096124.84501: exiting _queue_task() for managed_node3/stat 11701 1727096124.84511: done queuing things up, now waiting for results queue to drain 11701 1727096124.84512: waiting for pending results... 
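The task queued above, defined at set_facts.yml:12, is skipped in the trace that follows because __network_is_ostree is already defined. The log records only the task name, path, and false condition; a minimal sketch of a typical ostree detection task is given below, where the stat path and the register name are assumptions and only the when expression is the one evaluated in the trace:

    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted          # assumed path for illustration
      register: __ostree_booted_stat      # register name assumed for illustration
      when: not __network_is_ostree is defined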
11701 1727096124.84776: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 11701 1727096124.84917: in run() - task 0afff68d-5257-a05c-c957-00000000018a 11701 1727096124.84937: variable 'ansible_search_path' from source: unknown 11701 1727096124.84972: variable 'ansible_search_path' from source: unknown 11701 1727096124.84995: calling self._execute() 11701 1727096124.85078: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096124.85094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096124.85109: variable 'omit' from source: magic vars 11701 1727096124.85572: variable 'ansible_distribution_major_version' from source: facts 11701 1727096124.85576: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096124.85641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11701 1727096124.85985: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11701 1727096124.86037: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11701 1727096124.86078: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11701 1727096124.86114: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11701 1727096124.86204: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11701 1727096124.86238: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11701 1727096124.86272: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096124.86303: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11701 1727096124.86398: variable '__network_is_ostree' from source: set_fact 11701 1727096124.86409: Evaluated conditional (not __network_is_ostree is defined): False 11701 1727096124.86452: when evaluation is False, skipping this task 11701 1727096124.86455: _execute() done 11701 1727096124.86457: dumping result to json 11701 1727096124.86459: done dumping result, returning 11701 1727096124.86466: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-a05c-c957-00000000018a] 11701 1727096124.86469: sending task result for task 0afff68d-5257-a05c-c957-00000000018a skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11701 1727096124.86605: no more pending results, returning what we have 11701 1727096124.86609: results queue empty 11701 1727096124.86610: checking for any_errors_fatal 11701 1727096124.86615: done checking for any_errors_fatal 11701 1727096124.86615: checking for max_fail_percentage 11701 1727096124.86617: done checking for max_fail_percentage 11701 1727096124.86618: checking to see if all hosts have 
failed and the running result is not ok 11701 1727096124.86619: done checking to see if all hosts have failed 11701 1727096124.86620: getting the remaining hosts for this loop 11701 1727096124.86622: done getting the remaining hosts for this loop 11701 1727096124.86625: getting the next task for host managed_node3 11701 1727096124.86634: done getting next task for host managed_node3 11701 1727096124.86638: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11701 1727096124.86642: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11701 1727096124.86658: getting variables 11701 1727096124.86660: in VariableManager get_vars() 11701 1727096124.86703: Calling all_inventory to load vars for managed_node3 11701 1727096124.86706: Calling groups_inventory to load vars for managed_node3 11701 1727096124.86709: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096124.86720: Calling all_plugins_play to load vars for managed_node3 11701 1727096124.86722: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096124.86726: Calling groups_plugins_play to load vars for managed_node3 11701 1727096124.87202: done sending task result for task 0afff68d-5257-a05c-c957-00000000018a 11701 1727096124.87206: WORKER PROCESS EXITING 11701 1727096124.87229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096124.87439: done with get_vars() 11701 1727096124.87453: done getting variables 11701 1727096124.87512: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Monday 23 September 2024 08:55:24 -0400 (0:00:00.033) 0:00:08.840 ****** 11701 1727096124.87547: entering _queue_task() for managed_node3/set_fact 11701 1727096124.87830: worker is 1 (out of 1 available) 11701 1727096124.87843: exiting _queue_task() for managed_node3/set_fact 11701 1727096124.87859: done queuing things up, now waiting for results queue to drain 11701 1727096124.87860: waiting for pending results... 
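The set_fact task queued above (set_facts.yml:17) would record the ostree flag; as the trace below shows, it too is skipped because the flag already exists. A minimal sketch, reusing the hypothetical register name from the previous sketch and guarded by the same condition the log evaluates:

    - name: Set flag to indicate system is ostree
      set_fact:
        # __ostree_booted_stat is the assumed register from the preceding sketch
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined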
11701 1727096124.88126: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11701 1727096124.88292: in run() - task 0afff68d-5257-a05c-c957-00000000018b 11701 1727096124.88473: variable 'ansible_search_path' from source: unknown 11701 1727096124.88476: variable 'ansible_search_path' from source: unknown 11701 1727096124.88479: calling self._execute() 11701 1727096124.88481: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096124.88484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096124.88487: variable 'omit' from source: magic vars 11701 1727096124.88815: variable 'ansible_distribution_major_version' from source: facts 11701 1727096124.88836: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096124.89009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11701 1727096124.89296: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11701 1727096124.89344: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11701 1727096124.89396: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11701 1727096124.89434: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11701 1727096124.89524: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11701 1727096124.89557: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11701 1727096124.89691: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096124.89695: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11701 1727096124.89714: variable '__network_is_ostree' from source: set_fact 11701 1727096124.89726: Evaluated conditional (not __network_is_ostree is defined): False 11701 1727096124.89733: when evaluation is False, skipping this task 11701 1727096124.89740: _execute() done 11701 1727096124.89748: dumping result to json 11701 1727096124.89759: done dumping result, returning 11701 1727096124.89774: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-a05c-c957-00000000018b] 11701 1727096124.89784: sending task result for task 0afff68d-5257-a05c-c957-00000000018b skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11701 1727096124.89954: no more pending results, returning what we have 11701 1727096124.89959: results queue empty 11701 1727096124.89960: checking for any_errors_fatal 11701 1727096124.89966: done checking for any_errors_fatal 11701 1727096124.89967: checking for max_fail_percentage 11701 1727096124.89970: done checking for max_fail_percentage 11701 1727096124.89971: checking to see 
if all hosts have failed and the running result is not ok 11701 1727096124.89973: done checking to see if all hosts have failed 11701 1727096124.89973: getting the remaining hosts for this loop 11701 1727096124.89975: done getting the remaining hosts for this loop 11701 1727096124.89979: getting the next task for host managed_node3 11701 1727096124.89990: done getting next task for host managed_node3 11701 1727096124.89994: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 11701 1727096124.89998: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11701 1727096124.90012: getting variables 11701 1727096124.90014: in VariableManager get_vars() 11701 1727096124.90058: Calling all_inventory to load vars for managed_node3 11701 1727096124.90061: Calling groups_inventory to load vars for managed_node3 11701 1727096124.90063: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096124.90075: Calling all_plugins_play to load vars for managed_node3 11701 1727096124.90077: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096124.90082: done sending task result for task 0afff68d-5257-a05c-c957-00000000018b 11701 1727096124.90084: WORKER PROCESS EXITING 11701 1727096124.90087: Calling groups_plugins_play to load vars for managed_node3 11701 1727096124.90259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096124.90411: done with get_vars() 11701 1727096124.90420: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Monday 23 September 2024 08:55:24 -0400 (0:00:00.029) 0:00:08.869 ****** 11701 1727096124.90491: entering _queue_task() for managed_node3/service_facts 11701 1727096124.90492: Creating lock for service_facts 11701 1727096124.90699: worker is 1 (out of 1 available) 11701 1727096124.90712: exiting _queue_task() for managed_node3/service_facts 11701 1727096124.90724: done queuing things up, now waiting for results queue to drain 11701 1727096124.90725: waiting for pending results... 
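Unlike the previous three tasks, the service_facts task queued above (set_facts.yml:21) actually runs on managed_node3; the connection setup, module transfer, and the large JSON document further below are its execution and output. A minimal sketch of the task, plus a purely illustrative consumer of the gathered data (the debug task is not part of the role):

    - name: Check which services are running
      service_facts:

    # illustrative follow-up, not part of the role: read one entry from the gathered facts
    - name: Show NetworkManager state from the gathered service facts
      debug:
        msg: "{{ ansible_facts.services['NetworkManager.service'].state }}"

service_facts takes no arguments; it populates ansible_facts.services with one entry per discovered unit, keyed by unit name, which is the structure visible in the stdout chunks below.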
11701 1727096124.90884: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 11701 1727096124.90961: in run() - task 0afff68d-5257-a05c-c957-00000000018d 11701 1727096124.90976: variable 'ansible_search_path' from source: unknown 11701 1727096124.90979: variable 'ansible_search_path' from source: unknown 11701 1727096124.91006: calling self._execute() 11701 1727096124.91070: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096124.91074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096124.91080: variable 'omit' from source: magic vars 11701 1727096124.91329: variable 'ansible_distribution_major_version' from source: facts 11701 1727096124.91339: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096124.91345: variable 'omit' from source: magic vars 11701 1727096124.91394: variable 'omit' from source: magic vars 11701 1727096124.91418: variable 'omit' from source: magic vars 11701 1727096124.91455: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096124.91479: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096124.91501: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096124.91512: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096124.91521: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096124.91544: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096124.91547: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096124.91552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096124.91623: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096124.91626: Set connection var ansible_timeout to 10 11701 1727096124.91629: Set connection var ansible_shell_type to sh 11701 1727096124.91634: Set connection var ansible_shell_executable to /bin/sh 11701 1727096124.91637: Set connection var ansible_connection to ssh 11701 1727096124.91644: Set connection var ansible_pipelining to False 11701 1727096124.91718: variable 'ansible_shell_executable' from source: unknown 11701 1727096124.91722: variable 'ansible_connection' from source: unknown 11701 1727096124.91725: variable 'ansible_module_compression' from source: unknown 11701 1727096124.91727: variable 'ansible_shell_type' from source: unknown 11701 1727096124.91729: variable 'ansible_shell_executable' from source: unknown 11701 1727096124.91732: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096124.91734: variable 'ansible_pipelining' from source: unknown 11701 1727096124.91736: variable 'ansible_timeout' from source: unknown 11701 1727096124.91738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096124.91916: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11701 1727096124.91941: variable 'omit' from source: magic vars 11701 
1727096124.91944: starting attempt loop 11701 1727096124.91946: running the handler 11701 1727096124.91949: _low_level_execute_command(): starting 11701 1727096124.91959: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096124.92701: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096124.92712: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096124.92727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096124.92750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096124.92785: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096124.92790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096124.92852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096124.92887: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096124.92900: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096124.92925: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096124.93004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096124.94685: stdout chunk (state=3): >>>/root <<< 11701 1727096124.94778: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096124.94811: stderr chunk (state=3): >>><<< 11701 1727096124.94814: stdout chunk (state=3): >>><<< 11701 1727096124.94836: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096124.94848: 
_low_level_execute_command(): starting 11701 1727096124.94854: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096124.948364-12181-22061050260776 `" && echo ansible-tmp-1727096124.948364-12181-22061050260776="` echo /root/.ansible/tmp/ansible-tmp-1727096124.948364-12181-22061050260776 `" ) && sleep 0' 11701 1727096124.95473: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096124.95477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096124.95487: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 11701 1727096124.95489: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 11701 1727096124.95493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096124.95541: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096124.95583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096124.97563: stdout chunk (state=3): >>>ansible-tmp-1727096124.948364-12181-22061050260776=/root/.ansible/tmp/ansible-tmp-1727096124.948364-12181-22061050260776 <<< 11701 1727096124.97675: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096124.97700: stderr chunk (state=3): >>><<< 11701 1727096124.97703: stdout chunk (state=3): >>><<< 11701 1727096124.97718: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096124.948364-12181-22061050260776=/root/.ansible/tmp/ansible-tmp-1727096124.948364-12181-22061050260776 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096124.97760: variable 'ansible_module_compression' from source: unknown 11701 1727096124.97802: ANSIBALLZ: Using lock for service_facts 11701 1727096124.97806: ANSIBALLZ: Acquiring lock 11701 1727096124.97808: ANSIBALLZ: Lock acquired: 139907401539584 11701 1727096124.97810: ANSIBALLZ: Creating module 11701 1727096125.09942: ANSIBALLZ: Writing module into payload 11701 1727096125.10012: ANSIBALLZ: Writing module 11701 1727096125.10034: ANSIBALLZ: Renaming module 11701 1727096125.10039: ANSIBALLZ: Done creating module 11701 1727096125.10057: variable 'ansible_facts' from source: unknown 11701 1727096125.10104: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096124.948364-12181-22061050260776/AnsiballZ_service_facts.py 11701 1727096125.10210: Sending initial data 11701 1727096125.10213: Sent initial data (160 bytes) 11701 1727096125.10657: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096125.10695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096125.10699: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 11701 1727096125.10702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096125.10704: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096125.10753: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096125.10756: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096125.10759: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096125.10808: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096125.12446: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096125.12475: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11701 1727096125.12514: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpp6p_v2jg /root/.ansible/tmp/ansible-tmp-1727096124.948364-12181-22061050260776/AnsiballZ_service_facts.py <<< 11701 1727096125.12518: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096124.948364-12181-22061050260776/AnsiballZ_service_facts.py" <<< 11701 1727096125.12543: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpp6p_v2jg" to remote "/root/.ansible/tmp/ansible-tmp-1727096124.948364-12181-22061050260776/AnsiballZ_service_facts.py" <<< 11701 1727096125.12550: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096124.948364-12181-22061050260776/AnsiballZ_service_facts.py" <<< 11701 1727096125.13058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096125.13106: stderr chunk (state=3): >>><<< 11701 1727096125.13110: stdout chunk (state=3): >>><<< 11701 1727096125.13133: done transferring module to remote 11701 1727096125.13142: _low_level_execute_command(): starting 11701 1727096125.13147: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096124.948364-12181-22061050260776/ /root/.ansible/tmp/ansible-tmp-1727096124.948364-12181-22061050260776/AnsiballZ_service_facts.py && sleep 0' 11701 1727096125.13612: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096125.13616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 11701 1727096125.13622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096125.13624: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096125.13626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096125.13669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096125.13673: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096125.13710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096125.15562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096125.15570: stderr chunk (state=3): >>><<< 11701 1727096125.15573: stdout chunk (state=3): >>><<< 11701 1727096125.15588: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096125.15591: _low_level_execute_command(): starting 11701 1727096125.15596: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096124.948364-12181-22061050260776/AnsiballZ_service_facts.py && sleep 0' 11701 1727096125.16033: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096125.16038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 11701 1727096125.16066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096125.16071: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096125.16074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096125.16076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096125.16134: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096125.16137: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096125.16139: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096125.16183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096126.75292: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 11701 1727096126.75315: 
stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": 
{"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": 
{"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "stat<<< 11701 1727096126.75330: stdout chunk (state=3): >>>us": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive",<<< 11701 1727096126.75345: stdout chunk (state=3): >>> "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": 
"systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 11701 1727096126.76910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 11701 1727096126.76939: stderr chunk (state=3): >>><<< 11701 1727096126.76942: stdout chunk (state=3): >>><<< 11701 1727096126.76972: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", 
"source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": 
"systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", 
"state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 11701 1727096126.78241: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096124.948364-12181-22061050260776/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096126.78249: _low_level_execute_command(): starting 11701 1727096126.78257: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096124.948364-12181-22061050260776/ > /dev/null 2>&1 && sleep 0' 11701 1727096126.78725: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096126.78729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096126.78731: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096126.78733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 11701 1727096126.78736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096126.78789: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096126.78792: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096126.78794: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096126.78836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096126.80700: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096126.80721: stderr chunk (state=3): >>><<< 11701 1727096126.80724: stdout chunk (state=3): >>><<< 11701 1727096126.80737: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096126.80743: handler run complete 11701 1727096126.80857: variable 'ansible_facts' from source: unknown 11701 1727096126.80955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096126.81209: variable 'ansible_facts' from source: unknown 11701 1727096126.81287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096126.81398: attempt loop complete, returning result 11701 1727096126.81402: _execute() done 11701 1727096126.81404: dumping result to json 11701 1727096126.81444: done dumping result, returning 11701 1727096126.81455: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-a05c-c957-00000000018d] 11701 1727096126.81457: sending task result for task 0afff68d-5257-a05c-c957-00000000018d ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11701 1727096126.82024: no more pending results, returning what we have 11701 1727096126.82026: results queue empty 11701 1727096126.82027: checking for any_errors_fatal 11701 1727096126.82030: done checking for any_errors_fatal 11701 1727096126.82031: checking for max_fail_percentage 11701 1727096126.82032: done checking for max_fail_percentage 11701 1727096126.82033: checking to see if all hosts have failed and the running result is not ok 11701 1727096126.82034: done checking to see if all hosts have failed 11701 1727096126.82034: getting the remaining hosts for this loop 11701 1727096126.82035: done getting the remaining hosts for this loop 11701 1727096126.82038: getting the next task for host managed_node3 11701 1727096126.82042: done getting next task for host managed_node3 11701 1727096126.82045: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 11701 1727096126.82049: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 11701 1727096126.82060: getting variables 11701 1727096126.82061: in VariableManager get_vars() 11701 1727096126.82101: Calling all_inventory to load vars for managed_node3 11701 1727096126.82103: Calling groups_inventory to load vars for managed_node3 11701 1727096126.82105: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096126.82114: Calling all_plugins_play to load vars for managed_node3 11701 1727096126.82116: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096126.82119: Calling groups_plugins_play to load vars for managed_node3 11701 1727096126.82456: done sending task result for task 0afff68d-5257-a05c-c957-00000000018d 11701 1727096126.82459: WORKER PROCESS EXITING 11701 1727096126.82470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096126.82738: done with get_vars() 11701 1727096126.82749: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Monday 23 September 2024 08:55:26 -0400 (0:00:01.923) 0:00:10.792 ****** 11701 1727096126.82823: entering _queue_task() for managed_node3/package_facts 11701 1727096126.82824: Creating lock for package_facts 11701 1727096126.83049: worker is 1 (out of 1 available) 11701 1727096126.83066: exiting _queue_task() for managed_node3/package_facts 11701 1727096126.83079: done queuing things up, now waiting for results queue to drain 11701 1727096126.83080: waiting for pending results... 11701 1727096126.83231: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 11701 1727096126.83318: in run() - task 0afff68d-5257-a05c-c957-00000000018e 11701 1727096126.83330: variable 'ansible_search_path' from source: unknown 11701 1727096126.83332: variable 'ansible_search_path' from source: unknown 11701 1727096126.83360: calling self._execute() 11701 1727096126.83424: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096126.83428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096126.83437: variable 'omit' from source: magic vars 11701 1727096126.83696: variable 'ansible_distribution_major_version' from source: facts 11701 1727096126.83705: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096126.83711: variable 'omit' from source: magic vars 11701 1727096126.83760: variable 'omit' from source: magic vars 11701 1727096126.83785: variable 'omit' from source: magic vars 11701 1727096126.83816: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096126.83842: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096126.83862: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096126.83878: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096126.83888: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096126.83910: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096126.83913: variable 
'ansible_host' from source: host vars for 'managed_node3' 11701 1727096126.83916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096126.83988: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096126.83992: Set connection var ansible_timeout to 10 11701 1727096126.83995: Set connection var ansible_shell_type to sh 11701 1727096126.84000: Set connection var ansible_shell_executable to /bin/sh 11701 1727096126.84002: Set connection var ansible_connection to ssh 11701 1727096126.84010: Set connection var ansible_pipelining to False 11701 1727096126.84025: variable 'ansible_shell_executable' from source: unknown 11701 1727096126.84028: variable 'ansible_connection' from source: unknown 11701 1727096126.84031: variable 'ansible_module_compression' from source: unknown 11701 1727096126.84033: variable 'ansible_shell_type' from source: unknown 11701 1727096126.84035: variable 'ansible_shell_executable' from source: unknown 11701 1727096126.84037: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096126.84041: variable 'ansible_pipelining' from source: unknown 11701 1727096126.84044: variable 'ansible_timeout' from source: unknown 11701 1727096126.84048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096126.84189: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11701 1727096126.84197: variable 'omit' from source: magic vars 11701 1727096126.84201: starting attempt loop 11701 1727096126.84204: running the handler 11701 1727096126.84215: _low_level_execute_command(): starting 11701 1727096126.84222: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096126.84748: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096126.84752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096126.84756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096126.84808: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096126.84811: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096126.84814: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096126.84857: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096126.86486: stdout chunk (state=3): >>>/root <<< 11701 1727096126.86585: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 11701 1727096126.86614: stderr chunk (state=3): >>><<< 11701 1727096126.86617: stdout chunk (state=3): >>><<< 11701 1727096126.86638: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096126.86648: _low_level_execute_command(): starting 11701 1727096126.86656: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096126.8663733-12245-192562556235299 `" && echo ansible-tmp-1727096126.8663733-12245-192562556235299="` echo /root/.ansible/tmp/ansible-tmp-1727096126.8663733-12245-192562556235299 `" ) && sleep 0' 11701 1727096126.87097: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096126.87100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 11701 1727096126.87105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 11701 1727096126.87116: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096126.87119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096126.87159: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096126.87162: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096126.87198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096126.89085: stdout chunk (state=3): 
>>>ansible-tmp-1727096126.8663733-12245-192562556235299=/root/.ansible/tmp/ansible-tmp-1727096126.8663733-12245-192562556235299 <<< 11701 1727096126.89192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096126.89219: stderr chunk (state=3): >>><<< 11701 1727096126.89222: stdout chunk (state=3): >>><<< 11701 1727096126.89236: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096126.8663733-12245-192562556235299=/root/.ansible/tmp/ansible-tmp-1727096126.8663733-12245-192562556235299 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096126.89282: variable 'ansible_module_compression' from source: unknown 11701 1727096126.89320: ANSIBALLZ: Using lock for package_facts 11701 1727096126.89323: ANSIBALLZ: Acquiring lock 11701 1727096126.89326: ANSIBALLZ: Lock acquired: 139907400642976 11701 1727096126.89328: ANSIBALLZ: Creating module 11701 1727096127.07040: ANSIBALLZ: Writing module into payload 11701 1727096127.07130: ANSIBALLZ: Writing module 11701 1727096127.07152: ANSIBALLZ: Renaming module 11701 1727096127.07164: ANSIBALLZ: Done creating module 11701 1727096127.07189: variable 'ansible_facts' from source: unknown 11701 1727096127.07302: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096126.8663733-12245-192562556235299/AnsiballZ_package_facts.py 11701 1727096127.07407: Sending initial data 11701 1727096127.07410: Sent initial data (162 bytes) 11701 1727096127.07937: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096127.07964: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096127.08038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096127.09649: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11701 1727096127.09701: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096127.09734: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11701 1727096127.09795: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpb_fzs933 /root/.ansible/tmp/ansible-tmp-1727096126.8663733-12245-192562556235299/AnsiballZ_package_facts.py <<< 11701 1727096127.09799: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096126.8663733-12245-192562556235299/AnsiballZ_package_facts.py" <<< 11701 1727096127.09843: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpb_fzs933" to remote "/root/.ansible/tmp/ansible-tmp-1727096126.8663733-12245-192562556235299/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096126.8663733-12245-192562556235299/AnsiballZ_package_facts.py" <<< 11701 1727096127.11409: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096127.11475: stderr chunk (state=3): >>><<< 11701 1727096127.11478: stdout chunk (state=3): >>><<< 11701 1727096127.11483: done transferring module to remote 11701 1727096127.11499: _low_level_execute_command(): starting 11701 1727096127.11509: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096126.8663733-12245-192562556235299/ /root/.ansible/tmp/ansible-tmp-1727096126.8663733-12245-192562556235299/AnsiballZ_package_facts.py && sleep 0' 11701 1727096127.11969: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096127.11986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096127.11999: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096127.12047: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096127.12050: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096127.12094: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096127.13868: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096127.13892: stderr chunk (state=3): >>><<< 11701 1727096127.13895: stdout chunk (state=3): >>><<< 11701 1727096127.13910: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096127.13913: _low_level_execute_command(): starting 11701 1727096127.13917: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096126.8663733-12245-192562556235299/AnsiballZ_package_facts.py && sleep 0' 11701 1727096127.14352: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096127.14355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 11701 1727096127.14358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096127.14360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 11701 1727096127.14362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 
1727096127.14416: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096127.14423: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096127.14426: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096127.14459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096127.58704: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": 
[{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 11701 1727096127.58713: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 11701 1727096127.58754: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 11701 1727096127.58765: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": 
[{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", 
"version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 11701 1727096127.58789: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": 
"e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 11701 1727096127.58819: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", 
"version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": 
"systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 11701 1727096127.58831: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", 
"version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.7<<< 11701 1727096127.58854: stdout chunk (state=3): >>>3.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": 
[{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 11701 1727096127.58872: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": 
"perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 11701 1727096127.58890: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": 
"0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 11701 1727096127.58900: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 11701 1727096127.58917: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": 
"libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 11701 1727096127.58949: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", 
"version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 11701 1727096127.58956: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 11701 1727096127.60694: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection 
to 10.31.14.152 closed. <<< 11701 1727096127.60716: stderr chunk (state=3): >>><<< 11701 1727096127.60730: stdout chunk (state=3): >>><<< 11701 1727096127.60903: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", 
"version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": 
"dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", 
"release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 11701 1727096127.62327: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096126.8663733-12245-192562556235299/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096127.62346: _low_level_execute_command(): starting 11701 1727096127.62350: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096126.8663733-12245-192562556235299/ > /dev/null 2>&1 && sleep 0' 11701 1727096127.62774: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096127.62864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096127.62870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096127.62873: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096127.62879: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096127.62882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096127.62884: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096127.62905: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096127.62917: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096127.62987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096127.64814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096127.64876: stderr chunk (state=3): >>><<< 11701 1727096127.64885: stdout chunk (state=3): >>><<< 11701 1727096127.64902: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096127.64913: handler run complete 11701 1727096127.65778: variable 'ansible_facts' from source: unknown 11701 1727096127.66239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096127.68321: variable 'ansible_facts' from source: unknown 11701 1727096127.68818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096127.69543: attempt loop complete, returning result 11701 1727096127.69566: _execute() done 11701 1727096127.69577: dumping result to json 11701 1727096127.69843: done dumping result, returning 11701 1727096127.69847: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-a05c-c957-00000000018e] 11701 1727096127.69849: sending task result for task 0afff68d-5257-a05c-c957-00000000018e 11701 1727096127.71279: done sending task result for task 0afff68d-5257-a05c-c957-00000000018e 11701 1727096127.71282: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11701 1727096127.71324: no more pending results, returning what we have 11701 1727096127.71326: results queue empty 11701 1727096127.71327: checking for any_errors_fatal 11701 1727096127.71329: done checking for any_errors_fatal 11701 1727096127.71330: checking for max_fail_percentage 11701 1727096127.71331: done checking for max_fail_percentage 11701 1727096127.71332: checking to see if all hosts have failed and the running result is not ok 11701 1727096127.71332: done checking to see if all hosts have failed 11701 1727096127.71333: getting the remaining hosts for this loop 11701 1727096127.71334: done getting the remaining hosts for this loop 11701 1727096127.71336: getting the next task for host managed_node3 11701 1727096127.71341: done getting next task for host managed_node3 11701 1727096127.71343: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 11701 1727096127.71345: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096127.71353: getting variables 11701 1727096127.71354: in VariableManager get_vars() 11701 1727096127.71380: Calling all_inventory to load vars for managed_node3 11701 1727096127.71382: Calling groups_inventory to load vars for managed_node3 11701 1727096127.71383: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096127.71389: Calling all_plugins_play to load vars for managed_node3 11701 1727096127.71391: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096127.71393: Calling groups_plugins_play to load vars for managed_node3 11701 1727096127.72066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096127.73400: done with get_vars() 11701 1727096127.73417: done getting variables 11701 1727096127.73465: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 08:55:27 -0400 (0:00:00.906) 0:00:11.699 ****** 11701 1727096127.73493: entering _queue_task() for managed_node3/debug 11701 1727096127.73715: worker is 1 (out of 1 available) 11701 1727096127.73729: exiting _queue_task() for managed_node3/debug 11701 1727096127.73739: done queuing things up, now waiting for results queue to drain 11701 1727096127.73740: waiting for pending results... 11701 1727096127.73913: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 11701 1727096127.73985: in run() - task 0afff68d-5257-a05c-c957-000000000027 11701 1727096127.74000: variable 'ansible_search_path' from source: unknown 11701 1727096127.74004: variable 'ansible_search_path' from source: unknown 11701 1727096127.74030: calling self._execute() 11701 1727096127.74096: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096127.74101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096127.74110: variable 'omit' from source: magic vars 11701 1727096127.74375: variable 'ansible_distribution_major_version' from source: facts 11701 1727096127.74384: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096127.74390: variable 'omit' from source: magic vars 11701 1727096127.74423: variable 'omit' from source: magic vars 11701 1727096127.74495: variable 'network_provider' from source: set_fact 11701 1727096127.74509: variable 'omit' from source: magic vars 11701 1727096127.74546: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096127.74573: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096127.74589: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096127.74601: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096127.74614: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 
1727096127.74636: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096127.74639: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096127.74641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096127.74713: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096127.74717: Set connection var ansible_timeout to 10 11701 1727096127.74721: Set connection var ansible_shell_type to sh 11701 1727096127.74725: Set connection var ansible_shell_executable to /bin/sh 11701 1727096127.74728: Set connection var ansible_connection to ssh 11701 1727096127.74735: Set connection var ansible_pipelining to False 11701 1727096127.74751: variable 'ansible_shell_executable' from source: unknown 11701 1727096127.74762: variable 'ansible_connection' from source: unknown 11701 1727096127.74765: variable 'ansible_module_compression' from source: unknown 11701 1727096127.74769: variable 'ansible_shell_type' from source: unknown 11701 1727096127.74772: variable 'ansible_shell_executable' from source: unknown 11701 1727096127.74774: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096127.74776: variable 'ansible_pipelining' from source: unknown 11701 1727096127.74779: variable 'ansible_timeout' from source: unknown 11701 1727096127.74781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096127.74880: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096127.74888: variable 'omit' from source: magic vars 11701 1727096127.74893: starting attempt loop 11701 1727096127.74896: running the handler 11701 1727096127.74931: handler run complete 11701 1727096127.74943: attempt loop complete, returning result 11701 1727096127.74946: _execute() done 11701 1727096127.74949: dumping result to json 11701 1727096127.74951: done dumping result, returning 11701 1727096127.74958: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-a05c-c957-000000000027] 11701 1727096127.74962: sending task result for task 0afff68d-5257-a05c-c957-000000000027 11701 1727096127.75039: done sending task result for task 0afff68d-5257-a05c-c957-000000000027 11701 1727096127.75042: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 11701 1727096127.75105: no more pending results, returning what we have 11701 1727096127.75108: results queue empty 11701 1727096127.75109: checking for any_errors_fatal 11701 1727096127.75118: done checking for any_errors_fatal 11701 1727096127.75119: checking for max_fail_percentage 11701 1727096127.75120: done checking for max_fail_percentage 11701 1727096127.75121: checking to see if all hosts have failed and the running result is not ok 11701 1727096127.75122: done checking to see if all hosts have failed 11701 1727096127.75123: getting the remaining hosts for this loop 11701 1727096127.75124: done getting the remaining hosts for this loop 11701 1727096127.75127: getting the next task for host managed_node3 11701 1727096127.75134: done getting next task for host managed_node3 11701 1727096127.75138: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 11701 1727096127.75141: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11701 1727096127.75150: getting variables 11701 1727096127.75152: in VariableManager get_vars() 11701 1727096127.75193: Calling all_inventory to load vars for managed_node3 11701 1727096127.75195: Calling groups_inventory to load vars for managed_node3 11701 1727096127.75197: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096127.75205: Calling all_plugins_play to load vars for managed_node3 11701 1727096127.75207: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096127.75209: Calling groups_plugins_play to load vars for managed_node3 11701 1727096127.76556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096127.77542: done with get_vars() 11701 1727096127.77561: done getting variables 11701 1727096127.77627: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 08:55:27 -0400 (0:00:00.041) 0:00:11.741 ****** 11701 1727096127.77655: entering _queue_task() for managed_node3/fail 11701 1727096127.77656: Creating lock for fail 11701 1727096127.77887: worker is 1 (out of 1 available) 11701 1727096127.77902: exiting _queue_task() for managed_node3/fail 11701 1727096127.77913: done queuing things up, now waiting for results queue to drain 11701 1727096127.77915: waiting for pending results... 
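The large JSON document earlier in this stream is the raw stdout of the "Check which packages are installed" task: package_facts enumerates every installed RPM on the managed host and returns it under ansible_facts.packages. The task result itself is censored in the play output ('no_log: true' was specified), but the verbose connection debugging still echoes the module response and its invocation arguments (manager: auto, strategy: first). A minimal sketch of what such a task plausibly looks like, assuming only those echoed arguments; the wording of the real task in the role is not shown in this log:

    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto
        strategy: first
      no_log: true
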
11701 1727096127.78085: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11701 1727096127.78165: in run() - task 0afff68d-5257-a05c-c957-000000000028 11701 1727096127.78178: variable 'ansible_search_path' from source: unknown 11701 1727096127.78181: variable 'ansible_search_path' from source: unknown 11701 1727096127.78213: calling self._execute() 11701 1727096127.78280: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096127.78284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096127.78292: variable 'omit' from source: magic vars 11701 1727096127.78610: variable 'ansible_distribution_major_version' from source: facts 11701 1727096127.78623: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096127.78873: variable 'network_state' from source: role '' defaults 11701 1727096127.78877: Evaluated conditional (network_state != {}): False 11701 1727096127.78880: when evaluation is False, skipping this task 11701 1727096127.78882: _execute() done 11701 1727096127.78884: dumping result to json 11701 1727096127.78886: done dumping result, returning 11701 1727096127.78889: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-a05c-c957-000000000028] 11701 1727096127.78892: sending task result for task 0afff68d-5257-a05c-c957-000000000028 11701 1727096127.78965: done sending task result for task 0afff68d-5257-a05c-c957-000000000028 11701 1727096127.78972: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11701 1727096127.79018: no more pending results, returning what we have 11701 1727096127.79022: results queue empty 11701 1727096127.79024: checking for any_errors_fatal 11701 1727096127.79033: done checking for any_errors_fatal 11701 1727096127.79034: checking for max_fail_percentage 11701 1727096127.79036: done checking for max_fail_percentage 11701 1727096127.79037: checking to see if all hosts have failed and the running result is not ok 11701 1727096127.79038: done checking to see if all hosts have failed 11701 1727096127.79039: getting the remaining hosts for this loop 11701 1727096127.79040: done getting the remaining hosts for this loop 11701 1727096127.79043: getting the next task for host managed_node3 11701 1727096127.79052: done getting next task for host managed_node3 11701 1727096127.79056: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11701 1727096127.79059: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096127.79260: getting variables 11701 1727096127.79262: in VariableManager get_vars() 11701 1727096127.79300: Calling all_inventory to load vars for managed_node3 11701 1727096127.79303: Calling groups_inventory to load vars for managed_node3 11701 1727096127.79305: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096127.79314: Calling all_plugins_play to load vars for managed_node3 11701 1727096127.79316: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096127.79319: Calling groups_plugins_play to load vars for managed_node3 11701 1727096127.80338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096127.81705: done with get_vars() 11701 1727096127.81722: done getting variables 11701 1727096127.81771: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 08:55:27 -0400 (0:00:00.041) 0:00:11.782 ****** 11701 1727096127.81796: entering _queue_task() for managed_node3/fail 11701 1727096127.82037: worker is 1 (out of 1 available) 11701 1727096127.82049: exiting _queue_task() for managed_node3/fail 11701 1727096127.82060: done queuing things up, now waiting for results queue to drain 11701 1727096127.82061: waiting for pending results... 
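The skip recorded just above for the initscripts abort task shows how the role guards its hard failures: the fail action is loaded, but the reported false_condition, network_state != {}, evaluates to False because network_state still carries the empty role default, so the failure message is never raised. A hedged reconstruction of such a guarded abort task, based only on the task name and the condition quoted in the log (the real task at tasks/main.yml:11 may combine this with a check on the active provider, and its message text will differ):

    - name: >-
        Abort applying the network state configuration if using the
        `network_state` variable with the initscripts provider
      ansible.builtin.fail:
        # Message wording is illustrative; only the when: condition is taken from the log.
        msg: Using the network_state variable with the initscripts provider is not supported
      when: network_state != {}
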
11701 1727096127.82237: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11701 1727096127.82318: in run() - task 0afff68d-5257-a05c-c957-000000000029 11701 1727096127.82330: variable 'ansible_search_path' from source: unknown 11701 1727096127.82335: variable 'ansible_search_path' from source: unknown 11701 1727096127.82366: calling self._execute() 11701 1727096127.82441: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096127.82446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096127.82458: variable 'omit' from source: magic vars 11701 1727096127.82724: variable 'ansible_distribution_major_version' from source: facts 11701 1727096127.82735: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096127.82818: variable 'network_state' from source: role '' defaults 11701 1727096127.82827: Evaluated conditional (network_state != {}): False 11701 1727096127.82831: when evaluation is False, skipping this task 11701 1727096127.82833: _execute() done 11701 1727096127.82835: dumping result to json 11701 1727096127.82843: done dumping result, returning 11701 1727096127.82853: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-a05c-c957-000000000029] 11701 1727096127.82856: sending task result for task 0afff68d-5257-a05c-c957-000000000029 11701 1727096127.82940: done sending task result for task 0afff68d-5257-a05c-c957-000000000029 11701 1727096127.82943: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11701 1727096127.82991: no more pending results, returning what we have 11701 1727096127.82995: results queue empty 11701 1727096127.82995: checking for any_errors_fatal 11701 1727096127.83002: done checking for any_errors_fatal 11701 1727096127.83002: checking for max_fail_percentage 11701 1727096127.83004: done checking for max_fail_percentage 11701 1727096127.83005: checking to see if all hosts have failed and the running result is not ok 11701 1727096127.83006: done checking to see if all hosts have failed 11701 1727096127.83007: getting the remaining hosts for this loop 11701 1727096127.83008: done getting the remaining hosts for this loop 11701 1727096127.83011: getting the next task for host managed_node3 11701 1727096127.83018: done getting next task for host managed_node3 11701 1727096127.83021: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11701 1727096127.83024: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096127.83040: getting variables 11701 1727096127.83042: in VariableManager get_vars() 11701 1727096127.83079: Calling all_inventory to load vars for managed_node3 11701 1727096127.83085: Calling groups_inventory to load vars for managed_node3 11701 1727096127.83087: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096127.83096: Calling all_plugins_play to load vars for managed_node3 11701 1727096127.83098: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096127.83100: Calling groups_plugins_play to load vars for managed_node3 11701 1727096127.84324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096127.85224: done with get_vars() 11701 1727096127.85245: done getting variables 11701 1727096127.85291: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 08:55:27 -0400 (0:00:00.035) 0:00:11.817 ****** 11701 1727096127.85316: entering _queue_task() for managed_node3/fail 11701 1727096127.85555: worker is 1 (out of 1 available) 11701 1727096127.85570: exiting _queue_task() for managed_node3/fail 11701 1727096127.85581: done queuing things up, now waiting for results queue to drain 11701 1727096127.85583: waiting for pending results... 
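The teaming abort task queued above is evaluated in the entries that follow: the version and distribution guards pass (ansible_distribution_major_version | int > 9, ansible_distribution in __network_rh_distros), but the team-detection expression reported as the false_condition finds no connection or network_state interface of type "team", so the task is skipped. A sketch of how such a multi-condition guard might be written, reusing the expressions the log evaluates; the message text and exact layout are assumptions, not the role's actual source:

    - name: >-
        Abort applying teaming configuration if the system version of the
        managed host is EL10 or later
      ansible.builtin.fail:
        # Message wording is illustrative.
        msg: Team interfaces are not supported on EL10 or later
      when:
        # Guards reconstructed from the conditionals evaluated in the log entries below.
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros
        - >-
          network_connections | selectattr("type", "defined")
          | selectattr("type", "match", "^team$") | list | length > 0
          or network_state.get("interfaces", [])
          | selectattr("type", "defined")
          | selectattr("type", "match", "^team$") | list | length > 0
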
11701 1727096127.85755: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11701 1727096127.85840: in run() - task 0afff68d-5257-a05c-c957-00000000002a 11701 1727096127.85852: variable 'ansible_search_path' from source: unknown 11701 1727096127.85855: variable 'ansible_search_path' from source: unknown 11701 1727096127.85893: calling self._execute() 11701 1727096127.85959: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096127.85963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096127.85973: variable 'omit' from source: magic vars 11701 1727096127.86376: variable 'ansible_distribution_major_version' from source: facts 11701 1727096127.86380: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096127.86673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11701 1727096127.88219: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11701 1727096127.88270: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11701 1727096127.88298: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11701 1727096127.88325: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11701 1727096127.88346: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11701 1727096127.88413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096127.88434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096127.88473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096127.88483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096127.88494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096127.88562: variable 'ansible_distribution_major_version' from source: facts 11701 1727096127.88587: Evaluated conditional (ansible_distribution_major_version | int > 9): True 11701 1727096127.88663: variable 'ansible_distribution' from source: facts 11701 1727096127.88668: variable '__network_rh_distros' from source: role '' defaults 11701 1727096127.88671: Evaluated conditional (ansible_distribution in __network_rh_distros): True 11701 1727096127.88826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096127.88842: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096127.88860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096127.88891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096127.88902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096127.88934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096127.88953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096127.88970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096127.88999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096127.89009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096127.89037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096127.89055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096127.89071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096127.89100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096127.89111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096127.89572: variable 'network_connections' from source: task vars 11701 1727096127.89575: variable 'controller_profile' from source: play vars 11701 1727096127.89578: variable 'controller_profile' from source: play vars 11701 1727096127.89580: variable 'controller_device' from source: play vars 11701 1727096127.89583: variable 'controller_device' from source: play vars 11701 1727096127.89586: variable 'port1_profile' from 
source: play vars 11701 1727096127.89588: variable 'port1_profile' from source: play vars 11701 1727096127.89602: variable 'dhcp_interface1' from source: play vars 11701 1727096127.89661: variable 'dhcp_interface1' from source: play vars 11701 1727096127.89674: variable 'controller_profile' from source: play vars 11701 1727096127.89726: variable 'controller_profile' from source: play vars 11701 1727096127.89736: variable 'port2_profile' from source: play vars 11701 1727096127.89795: variable 'port2_profile' from source: play vars 11701 1727096127.89808: variable 'dhcp_interface2' from source: play vars 11701 1727096127.89872: variable 'dhcp_interface2' from source: play vars 11701 1727096127.89883: variable 'controller_profile' from source: play vars 11701 1727096127.90254: variable 'controller_profile' from source: play vars 11701 1727096127.90270: variable 'network_state' from source: role '' defaults 11701 1727096127.90338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11701 1727096127.90506: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11701 1727096127.90547: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11701 1727096127.90584: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11701 1727096127.90617: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11701 1727096127.90663: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11701 1727096127.90692: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11701 1727096127.90721: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096127.90752: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11701 1727096127.90796: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 11701 1727096127.90804: when evaluation is False, skipping this task 11701 1727096127.90809: _execute() done 11701 1727096127.90815: dumping result to json 11701 1727096127.90820: done dumping result, returning 11701 1727096127.90830: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-a05c-c957-00000000002a] 11701 1727096127.90838: sending task result for task 0afff68d-5257-a05c-c957-00000000002a skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", 
\"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 11701 1727096127.90980: no more pending results, returning what we have 11701 1727096127.90984: results queue empty 11701 1727096127.90985: checking for any_errors_fatal 11701 1727096127.90992: done checking for any_errors_fatal 11701 1727096127.90993: checking for max_fail_percentage 11701 1727096127.90995: done checking for max_fail_percentage 11701 1727096127.90996: checking to see if all hosts have failed and the running result is not ok 11701 1727096127.90997: done checking to see if all hosts have failed 11701 1727096127.90998: getting the remaining hosts for this loop 11701 1727096127.90999: done getting the remaining hosts for this loop 11701 1727096127.91074: getting the next task for host managed_node3 11701 1727096127.91082: done getting next task for host managed_node3 11701 1727096127.91087: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11701 1727096127.91090: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11701 1727096127.91106: getting variables 11701 1727096127.91108: in VariableManager get_vars() 11701 1727096127.91164: Calling all_inventory to load vars for managed_node3 11701 1727096127.91178: Calling groups_inventory to load vars for managed_node3 11701 1727096127.91181: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096127.91187: done sending task result for task 0afff68d-5257-a05c-c957-00000000002a 11701 1727096127.91189: WORKER PROCESS EXITING 11701 1727096127.91198: Calling all_plugins_play to load vars for managed_node3 11701 1727096127.91201: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096127.91203: Calling groups_plugins_play to load vars for managed_node3 11701 1727096127.92807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096127.94418: done with get_vars() 11701 1727096127.94444: done getting variables 11701 1727096127.94560: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 08:55:27 -0400 (0:00:00.092) 0:00:11.910 ****** 11701 1727096127.94601: entering _queue_task() for managed_node3/dnf 11701 1727096127.94940: worker is 1 (out of 1 available) 11701 1727096127.94956: exiting _queue_task() for managed_node3/dnf 11701 1727096127.95172: done queuing things up, now 
waiting for results queue to drain 11701 1727096127.95174: waiting for pending results... 11701 1727096127.95262: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11701 1727096127.95406: in run() - task 0afff68d-5257-a05c-c957-00000000002b 11701 1727096127.95430: variable 'ansible_search_path' from source: unknown 11701 1727096127.95439: variable 'ansible_search_path' from source: unknown 11701 1727096127.95483: calling self._execute() 11701 1727096127.95578: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096127.95592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096127.95605: variable 'omit' from source: magic vars 11701 1727096127.95992: variable 'ansible_distribution_major_version' from source: facts 11701 1727096127.96008: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096127.96221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11701 1727096127.98090: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11701 1727096127.98138: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11701 1727096127.98169: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11701 1727096127.98193: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11701 1727096127.98216: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11701 1727096127.98277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096127.98297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096127.98320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096127.98344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096127.98357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096127.98441: variable 'ansible_distribution' from source: facts 11701 1727096127.98445: variable 'ansible_distribution_major_version' from source: facts 11701 1727096127.98459: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 11701 1727096127.98540: variable '__network_wireless_connections_defined' from source: role '' defaults 11701 1727096127.98620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 11701 1727096127.98639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096127.98660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096127.98687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096127.98698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096127.98725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096127.98741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096127.98762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096127.98788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096127.98798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096127.98824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096127.98840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096127.98861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096127.98892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096127.98903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096127.99095: variable 'network_connections' from source: task vars 11701 1727096127.99098: variable 'controller_profile' from source: play vars 11701 1727096127.99137: variable 'controller_profile' from source: play vars 11701 1727096127.99140: variable 'controller_device' from source: play vars 11701 1727096127.99264: variable 
'controller_device' from source: play vars 11701 1727096127.99269: variable 'port1_profile' from source: play vars 11701 1727096127.99272: variable 'port1_profile' from source: play vars 11701 1727096127.99275: variable 'dhcp_interface1' from source: play vars 11701 1727096127.99473: variable 'dhcp_interface1' from source: play vars 11701 1727096127.99477: variable 'controller_profile' from source: play vars 11701 1727096127.99479: variable 'controller_profile' from source: play vars 11701 1727096127.99482: variable 'port2_profile' from source: play vars 11701 1727096127.99484: variable 'port2_profile' from source: play vars 11701 1727096127.99486: variable 'dhcp_interface2' from source: play vars 11701 1727096127.99514: variable 'dhcp_interface2' from source: play vars 11701 1727096127.99520: variable 'controller_profile' from source: play vars 11701 1727096127.99581: variable 'controller_profile' from source: play vars 11701 1727096127.99668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11701 1727096127.99830: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11701 1727096127.99908: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11701 1727096127.99911: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11701 1727096127.99932: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11701 1727096127.99974: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11701 1727096127.99996: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11701 1727096128.00021: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096128.00046: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11701 1727096128.00108: variable '__network_team_connections_defined' from source: role '' defaults 11701 1727096128.00319: variable 'network_connections' from source: task vars 11701 1727096128.00323: variable 'controller_profile' from source: play vars 11701 1727096128.00375: variable 'controller_profile' from source: play vars 11701 1727096128.00382: variable 'controller_device' from source: play vars 11701 1727096128.00423: variable 'controller_device' from source: play vars 11701 1727096128.00430: variable 'port1_profile' from source: play vars 11701 1727096128.00476: variable 'port1_profile' from source: play vars 11701 1727096128.00482: variable 'dhcp_interface1' from source: play vars 11701 1727096128.00525: variable 'dhcp_interface1' from source: play vars 11701 1727096128.00529: variable 'controller_profile' from source: play vars 11701 1727096128.00581: variable 'controller_profile' from source: play vars 11701 1727096128.00598: variable 'port2_profile' from source: play vars 11701 1727096128.00641: variable 'port2_profile' from source: play vars 11701 1727096128.00648: variable 'dhcp_interface2' 
from source: play vars 11701 1727096128.00696: variable 'dhcp_interface2' from source: play vars 11701 1727096128.00704: variable 'controller_profile' from source: play vars 11701 1727096128.00743: variable 'controller_profile' from source: play vars 11701 1727096128.00771: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11701 1727096128.00780: when evaluation is False, skipping this task 11701 1727096128.00783: _execute() done 11701 1727096128.00786: dumping result to json 11701 1727096128.00788: done dumping result, returning 11701 1727096128.00797: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-a05c-c957-00000000002b] 11701 1727096128.00803: sending task result for task 0afff68d-5257-a05c-c957-00000000002b 11701 1727096128.00899: done sending task result for task 0afff68d-5257-a05c-c957-00000000002b 11701 1727096128.00901: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11701 1727096128.00969: no more pending results, returning what we have 11701 1727096128.00974: results queue empty 11701 1727096128.00975: checking for any_errors_fatal 11701 1727096128.00981: done checking for any_errors_fatal 11701 1727096128.00981: checking for max_fail_percentage 11701 1727096128.00983: done checking for max_fail_percentage 11701 1727096128.00984: checking to see if all hosts have failed and the running result is not ok 11701 1727096128.00985: done checking to see if all hosts have failed 11701 1727096128.00985: getting the remaining hosts for this loop 11701 1727096128.00987: done getting the remaining hosts for this loop 11701 1727096128.00991: getting the next task for host managed_node3 11701 1727096128.00996: done getting next task for host managed_node3 11701 1727096128.01000: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11701 1727096128.01002: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096128.01018: getting variables 11701 1727096128.01019: in VariableManager get_vars() 11701 1727096128.01062: Calling all_inventory to load vars for managed_node3 11701 1727096128.01065: Calling groups_inventory to load vars for managed_node3 11701 1727096128.01069: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096128.01078: Calling all_plugins_play to load vars for managed_node3 11701 1727096128.01080: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096128.01083: Calling groups_plugins_play to load vars for managed_node3 11701 1727096128.01879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096128.03171: done with get_vars() 11701 1727096128.03199: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11701 1727096128.03287: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 08:55:28 -0400 (0:00:00.087) 0:00:11.997 ****** 11701 1727096128.03319: entering _queue_task() for managed_node3/yum 11701 1727096128.03321: Creating lock for yum 11701 1727096128.03718: worker is 1 (out of 1 available) 11701 1727096128.03731: exiting _queue_task() for managed_node3/yum 11701 1727096128.03743: done queuing things up, now waiting for results queue to drain 11701 1727096128.03745: waiting for pending results... 
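The DNF availability check just above was skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined held for this connection set. For orientation only, a guard of that shape might look roughly like the sketch below; the when expression is the false_condition reported in the log, while the module arguments and check_mode are assumptions, since the task's source at roles/network/tasks/main.yml:36 is not reproduced here.

    - name: Check if updates for network packages are available through the DNF package manager  # illustrative reconstruction, not the role's source
      ansible.builtin.dnf:
        name: "{{ network_packages }}"    # assumed argument; the real module arguments are not visible in this log
        state: latest
      check_mode: true                    # assumed; models "check availability" without changing the host
      when: __network_wireless_connections_defined or __network_team_connections_defined
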
11701 1727096128.03956: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11701 1727096128.04043: in run() - task 0afff68d-5257-a05c-c957-00000000002c 11701 1727096128.04059: variable 'ansible_search_path' from source: unknown 11701 1727096128.04062: variable 'ansible_search_path' from source: unknown 11701 1727096128.04093: calling self._execute() 11701 1727096128.04159: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096128.04163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096128.04173: variable 'omit' from source: magic vars 11701 1727096128.04452: variable 'ansible_distribution_major_version' from source: facts 11701 1727096128.04463: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096128.04591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11701 1727096128.06849: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11701 1727096128.06974: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11701 1727096128.06978: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11701 1727096128.06981: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11701 1727096128.07004: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11701 1727096128.07087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096128.07117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096128.07145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096128.07182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096128.07196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096128.07292: variable 'ansible_distribution_major_version' from source: facts 11701 1727096128.07306: Evaluated conditional (ansible_distribution_major_version | int < 8): False 11701 1727096128.07309: when evaluation is False, skipping this task 11701 1727096128.07312: _execute() done 11701 1727096128.07315: dumping result to json 11701 1727096128.07318: done dumping result, returning 11701 1727096128.07326: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-a05c-c957-00000000002c] 11701 
1727096128.07331: sending task result for task 0afff68d-5257-a05c-c957-00000000002c 11701 1727096128.07430: done sending task result for task 0afff68d-5257-a05c-c957-00000000002c 11701 1727096128.07433: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 11701 1727096128.07513: no more pending results, returning what we have 11701 1727096128.07516: results queue empty 11701 1727096128.07517: checking for any_errors_fatal 11701 1727096128.07522: done checking for any_errors_fatal 11701 1727096128.07523: checking for max_fail_percentage 11701 1727096128.07524: done checking for max_fail_percentage 11701 1727096128.07525: checking to see if all hosts have failed and the running result is not ok 11701 1727096128.07526: done checking to see if all hosts have failed 11701 1727096128.07527: getting the remaining hosts for this loop 11701 1727096128.07528: done getting the remaining hosts for this loop 11701 1727096128.07532: getting the next task for host managed_node3 11701 1727096128.07537: done getting next task for host managed_node3 11701 1727096128.07541: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11701 1727096128.07544: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096128.07558: getting variables 11701 1727096128.07559: in VariableManager get_vars() 11701 1727096128.07601: Calling all_inventory to load vars for managed_node3 11701 1727096128.07604: Calling groups_inventory to load vars for managed_node3 11701 1727096128.07606: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096128.07615: Calling all_plugins_play to load vars for managed_node3 11701 1727096128.07618: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096128.07620: Calling groups_plugins_play to load vars for managed_node3 11701 1727096128.08562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096128.09413: done with get_vars() 11701 1727096128.09432: done getting variables 11701 1727096128.09478: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 08:55:28 -0400 (0:00:00.061) 0:00:12.059 ****** 11701 1727096128.09502: entering _queue_task() for managed_node3/fail 11701 1727096128.09801: worker is 1 (out of 1 available) 11701 1727096128.09814: exiting _queue_task() for managed_node3/fail 11701 1727096128.09828: done queuing things up, now waiting for results queue to drain 11701 1727096128.09829: waiting for pending results... 
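The YUM-path variant of the same check was skipped on this host because ansible_distribution_major_version | int < 8 is false, and the log also shows ansible.builtin.yum being redirected to ansible.builtin.dnf on this platform. A minimal sketch of such a version-gated task, in which everything except the when expression is assumed, could read:

    - name: Check if updates for network packages are available through the YUM package manager  # illustrative reconstruction
      ansible.builtin.yum:                # redirected to ansible.builtin.dnf on this host, as the log notes
        name: "{{ network_packages }}"    # assumed argument
        state: latest
      check_mode: true                    # assumed
      when: ansible_distribution_major_version | int < 8
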
11701 1727096128.10188: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11701 1727096128.10250: in run() - task 0afff68d-5257-a05c-c957-00000000002d 11701 1727096128.10274: variable 'ansible_search_path' from source: unknown 11701 1727096128.10304: variable 'ansible_search_path' from source: unknown 11701 1727096128.10327: calling self._execute() 11701 1727096128.10418: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096128.10474: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096128.10478: variable 'omit' from source: magic vars 11701 1727096128.10797: variable 'ansible_distribution_major_version' from source: facts 11701 1727096128.10808: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096128.10891: variable '__network_wireless_connections_defined' from source: role '' defaults 11701 1727096128.11031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11701 1727096128.13375: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11701 1727096128.13380: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11701 1727096128.13383: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11701 1727096128.13385: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11701 1727096128.13387: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11701 1727096128.13464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096128.13513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096128.13547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096128.13598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096128.13632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096128.13693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096128.13735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096128.13773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096128.13819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096128.13946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096128.13953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096128.13956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096128.13975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096128.14020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096128.14041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096128.14277: variable 'network_connections' from source: task vars 11701 1727096128.14297: variable 'controller_profile' from source: play vars 11701 1727096128.14388: variable 'controller_profile' from source: play vars 11701 1727096128.14405: variable 'controller_device' from source: play vars 11701 1727096128.14483: variable 'controller_device' from source: play vars 11701 1727096128.14509: variable 'port1_profile' from source: play vars 11701 1727096128.14579: variable 'port1_profile' from source: play vars 11701 1727096128.14605: variable 'dhcp_interface1' from source: play vars 11701 1727096128.14706: variable 'dhcp_interface1' from source: play vars 11701 1727096128.14713: variable 'controller_profile' from source: play vars 11701 1727096128.14762: variable 'controller_profile' from source: play vars 11701 1727096128.14777: variable 'port2_profile' from source: play vars 11701 1727096128.14854: variable 'port2_profile' from source: play vars 11701 1727096128.14925: variable 'dhcp_interface2' from source: play vars 11701 1727096128.14948: variable 'dhcp_interface2' from source: play vars 11701 1727096128.14965: variable 'controller_profile' from source: play vars 11701 1727096128.15041: variable 'controller_profile' from source: play vars 11701 1727096128.15126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11701 1727096128.15380: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11701 1727096128.15424: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11701 1727096128.15577: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11701 1727096128.15581: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11701 1727096128.15583: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11701 1727096128.15600: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11701 1727096128.15633: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096128.15671: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11701 1727096128.15770: variable '__network_team_connections_defined' from source: role '' defaults 11701 1727096128.16122: variable 'network_connections' from source: task vars 11701 1727096128.16125: variable 'controller_profile' from source: play vars 11701 1727096128.16163: variable 'controller_profile' from source: play vars 11701 1727096128.16177: variable 'controller_device' from source: play vars 11701 1727096128.16249: variable 'controller_device' from source: play vars 11701 1727096128.16266: variable 'port1_profile' from source: play vars 11701 1727096128.16332: variable 'port1_profile' from source: play vars 11701 1727096128.16341: variable 'dhcp_interface1' from source: play vars 11701 1727096128.16387: variable 'dhcp_interface1' from source: play vars 11701 1727096128.16392: variable 'controller_profile' from source: play vars 11701 1727096128.16434: variable 'controller_profile' from source: play vars 11701 1727096128.16440: variable 'port2_profile' from source: play vars 11701 1727096128.16486: variable 'port2_profile' from source: play vars 11701 1727096128.16492: variable 'dhcp_interface2' from source: play vars 11701 1727096128.16532: variable 'dhcp_interface2' from source: play vars 11701 1727096128.16538: variable 'controller_profile' from source: play vars 11701 1727096128.16586: variable 'controller_profile' from source: play vars 11701 1727096128.16610: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11701 1727096128.16614: when evaluation is False, skipping this task 11701 1727096128.16616: _execute() done 11701 1727096128.16619: dumping result to json 11701 1727096128.16621: done dumping result, returning 11701 1727096128.16630: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-a05c-c957-00000000002d] 11701 1727096128.16634: sending task result for task 0afff68d-5257-a05c-c957-00000000002d 11701 1727096128.16722: done sending task result for task 0afff68d-5257-a05c-c957-00000000002d 11701 1727096128.16725: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11701 1727096128.16780: no more pending results, returning what we have 11701 1727096128.16784: results queue empty 11701 1727096128.16785: checking for any_errors_fatal 11701 1727096128.16789: done checking for any_errors_fatal 11701 
1727096128.16790: checking for max_fail_percentage 11701 1727096128.16791: done checking for max_fail_percentage 11701 1727096128.16792: checking to see if all hosts have failed and the running result is not ok 11701 1727096128.16793: done checking to see if all hosts have failed 11701 1727096128.16794: getting the remaining hosts for this loop 11701 1727096128.16795: done getting the remaining hosts for this loop 11701 1727096128.16798: getting the next task for host managed_node3 11701 1727096128.16805: done getting next task for host managed_node3 11701 1727096128.16808: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 11701 1727096128.16811: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11701 1727096128.16824: getting variables 11701 1727096128.16825: in VariableManager get_vars() 11701 1727096128.16877: Calling all_inventory to load vars for managed_node3 11701 1727096128.16880: Calling groups_inventory to load vars for managed_node3 11701 1727096128.16882: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096128.16892: Calling all_plugins_play to load vars for managed_node3 11701 1727096128.16895: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096128.16898: Calling groups_plugins_play to load vars for managed_node3 11701 1727096128.17693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096128.18655: done with get_vars() 11701 1727096128.18675: done getting variables 11701 1727096128.18722: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 08:55:28 -0400 (0:00:00.092) 0:00:12.152 ****** 11701 1727096128.18746: entering _queue_task() for managed_node3/package 11701 1727096128.19003: worker is 1 (out of 1 available) 11701 1727096128.19016: exiting _queue_task() for managed_node3/package 11701 1727096128.19029: done queuing things up, now waiting for results queue to drain 11701 1727096128.19031: waiting for pending results... 
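The consent prompt that was just skipped is a fail-style guard (the 'fail' action plugin is loaded for it) which only triggers when wireless or team profiles are defined. A hedged sketch of that shape, with the message wording assumed and the when expression taken from the reported false_condition:

    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces  # illustrative reconstruction
      ansible.builtin.fail:
        msg: Restarting NetworkManager is required when wireless or team profiles are managed; re-run after acknowledging the restart.  # assumed wording; the real message is not in this log
      when: __network_wireless_connections_defined or __network_team_connections_defined
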
11701 1727096128.19198: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 11701 1727096128.19276: in run() - task 0afff68d-5257-a05c-c957-00000000002e 11701 1727096128.19289: variable 'ansible_search_path' from source: unknown 11701 1727096128.19292: variable 'ansible_search_path' from source: unknown 11701 1727096128.19321: calling self._execute() 11701 1727096128.19388: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096128.19394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096128.19402: variable 'omit' from source: magic vars 11701 1727096128.19667: variable 'ansible_distribution_major_version' from source: facts 11701 1727096128.19678: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096128.19815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11701 1727096128.20010: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11701 1727096128.20045: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11701 1727096128.20071: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11701 1727096128.20097: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11701 1727096128.20178: variable 'network_packages' from source: role '' defaults 11701 1727096128.20249: variable '__network_provider_setup' from source: role '' defaults 11701 1727096128.20259: variable '__network_service_name_default_nm' from source: role '' defaults 11701 1727096128.20306: variable '__network_service_name_default_nm' from source: role '' defaults 11701 1727096128.20313: variable '__network_packages_default_nm' from source: role '' defaults 11701 1727096128.20358: variable '__network_packages_default_nm' from source: role '' defaults 11701 1727096128.20474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11701 1727096128.21799: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11701 1727096128.21847: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11701 1727096128.21878: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11701 1727096128.21902: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11701 1727096128.21921: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11701 1727096128.21985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096128.22004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096128.22022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096128.22048: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096128.22060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096128.22096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096128.22113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096128.22129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096128.22155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096128.22164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096128.22311: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11701 1727096128.22387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096128.22405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096128.22424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096128.22448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096128.22459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096128.22529: variable 'ansible_python' from source: facts 11701 1727096128.22543: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11701 1727096128.22603: variable '__network_wpa_supplicant_required' from source: role '' defaults 11701 1727096128.22661: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11701 1727096128.22742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096128.22769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 11701 1727096128.22787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096128.22811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096128.22822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096128.22859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096128.22882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096128.22900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096128.22924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096128.22934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096128.23037: variable 'network_connections' from source: task vars 11701 1727096128.23040: variable 'controller_profile' from source: play vars 11701 1727096128.23116: variable 'controller_profile' from source: play vars 11701 1727096128.23125: variable 'controller_device' from source: play vars 11701 1727096128.23198: variable 'controller_device' from source: play vars 11701 1727096128.23208: variable 'port1_profile' from source: play vars 11701 1727096128.23277: variable 'port1_profile' from source: play vars 11701 1727096128.23287: variable 'dhcp_interface1' from source: play vars 11701 1727096128.23353: variable 'dhcp_interface1' from source: play vars 11701 1727096128.23363: variable 'controller_profile' from source: play vars 11701 1727096128.23434: variable 'controller_profile' from source: play vars 11701 1727096128.23442: variable 'port2_profile' from source: play vars 11701 1727096128.23517: variable 'port2_profile' from source: play vars 11701 1727096128.23521: variable 'dhcp_interface2' from source: play vars 11701 1727096128.23590: variable 'dhcp_interface2' from source: play vars 11701 1727096128.23598: variable 'controller_profile' from source: play vars 11701 1727096128.23671: variable 'controller_profile' from source: play vars 11701 1727096128.23727: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11701 1727096128.23749: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11701 
1727096128.23774: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096128.23796: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11701 1727096128.23834: variable '__network_wireless_connections_defined' from source: role '' defaults 11701 1727096128.24017: variable 'network_connections' from source: task vars 11701 1727096128.24021: variable 'controller_profile' from source: play vars 11701 1727096128.24095: variable 'controller_profile' from source: play vars 11701 1727096128.24102: variable 'controller_device' from source: play vars 11701 1727096128.24178: variable 'controller_device' from source: play vars 11701 1727096128.24185: variable 'port1_profile' from source: play vars 11701 1727096128.24250: variable 'port1_profile' from source: play vars 11701 1727096128.24261: variable 'dhcp_interface1' from source: play vars 11701 1727096128.24333: variable 'dhcp_interface1' from source: play vars 11701 1727096128.24340: variable 'controller_profile' from source: play vars 11701 1727096128.24413: variable 'controller_profile' from source: play vars 11701 1727096128.24420: variable 'port2_profile' from source: play vars 11701 1727096128.24491: variable 'port2_profile' from source: play vars 11701 1727096128.24506: variable 'dhcp_interface2' from source: play vars 11701 1727096128.24570: variable 'dhcp_interface2' from source: play vars 11701 1727096128.24577: variable 'controller_profile' from source: play vars 11701 1727096128.24646: variable 'controller_profile' from source: play vars 11701 1727096128.24691: variable '__network_packages_default_wireless' from source: role '' defaults 11701 1727096128.24747: variable '__network_wireless_connections_defined' from source: role '' defaults 11701 1727096128.24962: variable 'network_connections' from source: task vars 11701 1727096128.24965: variable 'controller_profile' from source: play vars 11701 1727096128.25012: variable 'controller_profile' from source: play vars 11701 1727096128.25018: variable 'controller_device' from source: play vars 11701 1727096128.25070: variable 'controller_device' from source: play vars 11701 1727096128.25078: variable 'port1_profile' from source: play vars 11701 1727096128.25122: variable 'port1_profile' from source: play vars 11701 1727096128.25128: variable 'dhcp_interface1' from source: play vars 11701 1727096128.25178: variable 'dhcp_interface1' from source: play vars 11701 1727096128.25183: variable 'controller_profile' from source: play vars 11701 1727096128.25227: variable 'controller_profile' from source: play vars 11701 1727096128.25233: variable 'port2_profile' from source: play vars 11701 1727096128.25284: variable 'port2_profile' from source: play vars 11701 1727096128.25290: variable 'dhcp_interface2' from source: play vars 11701 1727096128.25334: variable 'dhcp_interface2' from source: play vars 11701 1727096128.25339: variable 'controller_profile' from source: play vars 11701 1727096128.25389: variable 'controller_profile' from source: play vars 11701 1727096128.25410: variable '__network_packages_default_team' from source: role '' defaults 11701 1727096128.25469: variable '__network_team_connections_defined' from source: role '' defaults 11701 1727096128.25666: variable 'network_connections' from source: 
task vars 11701 1727096128.25672: variable 'controller_profile' from source: play vars 11701 1727096128.25719: variable 'controller_profile' from source: play vars 11701 1727096128.25725: variable 'controller_device' from source: play vars 11701 1727096128.25772: variable 'controller_device' from source: play vars 11701 1727096128.25780: variable 'port1_profile' from source: play vars 11701 1727096128.25828: variable 'port1_profile' from source: play vars 11701 1727096128.25834: variable 'dhcp_interface1' from source: play vars 11701 1727096128.25882: variable 'dhcp_interface1' from source: play vars 11701 1727096128.25887: variable 'controller_profile' from source: play vars 11701 1727096128.25936: variable 'controller_profile' from source: play vars 11701 1727096128.25942: variable 'port2_profile' from source: play vars 11701 1727096128.25991: variable 'port2_profile' from source: play vars 11701 1727096128.25996: variable 'dhcp_interface2' from source: play vars 11701 1727096128.26043: variable 'dhcp_interface2' from source: play vars 11701 1727096128.26049: variable 'controller_profile' from source: play vars 11701 1727096128.26096: variable 'controller_profile' from source: play vars 11701 1727096128.26144: variable '__network_service_name_default_initscripts' from source: role '' defaults 11701 1727096128.26189: variable '__network_service_name_default_initscripts' from source: role '' defaults 11701 1727096128.26194: variable '__network_packages_default_initscripts' from source: role '' defaults 11701 1727096128.26236: variable '__network_packages_default_initscripts' from source: role '' defaults 11701 1727096128.26376: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11701 1727096128.26696: variable 'network_connections' from source: task vars 11701 1727096128.26700: variable 'controller_profile' from source: play vars 11701 1727096128.26741: variable 'controller_profile' from source: play vars 11701 1727096128.26750: variable 'controller_device' from source: play vars 11701 1727096128.26797: variable 'controller_device' from source: play vars 11701 1727096128.26803: variable 'port1_profile' from source: play vars 11701 1727096128.26843: variable 'port1_profile' from source: play vars 11701 1727096128.26848: variable 'dhcp_interface1' from source: play vars 11701 1727096128.26893: variable 'dhcp_interface1' from source: play vars 11701 1727096128.26898: variable 'controller_profile' from source: play vars 11701 1727096128.26941: variable 'controller_profile' from source: play vars 11701 1727096128.26947: variable 'port2_profile' from source: play vars 11701 1727096128.26991: variable 'port2_profile' from source: play vars 11701 1727096128.26997: variable 'dhcp_interface2' from source: play vars 11701 1727096128.27041: variable 'dhcp_interface2' from source: play vars 11701 1727096128.27046: variable 'controller_profile' from source: play vars 11701 1727096128.27091: variable 'controller_profile' from source: play vars 11701 1727096128.27098: variable 'ansible_distribution' from source: facts 11701 1727096128.27100: variable '__network_rh_distros' from source: role '' defaults 11701 1727096128.27106: variable 'ansible_distribution_major_version' from source: facts 11701 1727096128.27129: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11701 1727096128.27236: variable 'ansible_distribution' from source: facts 11701 1727096128.27240: variable '__network_rh_distros' from source: role '' defaults 
11701 1727096128.27242: variable 'ansible_distribution_major_version' from source: facts 11701 1727096128.27257: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11701 1727096128.27364: variable 'ansible_distribution' from source: facts 11701 1727096128.27369: variable '__network_rh_distros' from source: role '' defaults 11701 1727096128.27373: variable 'ansible_distribution_major_version' from source: facts 11701 1727096128.27400: variable 'network_provider' from source: set_fact 11701 1727096128.27411: variable 'ansible_facts' from source: unknown 11701 1727096128.27866: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 11701 1727096128.27871: when evaluation is False, skipping this task 11701 1727096128.27874: _execute() done 11701 1727096128.28022: dumping result to json 11701 1727096128.28025: done dumping result, returning 11701 1727096128.28028: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-a05c-c957-00000000002e] 11701 1727096128.28030: sending task result for task 0afff68d-5257-a05c-c957-00000000002e 11701 1727096128.28107: done sending task result for task 0afff68d-5257-a05c-c957-00000000002e 11701 1727096128.28111: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 11701 1727096128.28160: no more pending results, returning what we have 11701 1727096128.28164: results queue empty 11701 1727096128.28165: checking for any_errors_fatal 11701 1727096128.28276: done checking for any_errors_fatal 11701 1727096128.28277: checking for max_fail_percentage 11701 1727096128.28279: done checking for max_fail_percentage 11701 1727096128.28280: checking to see if all hosts have failed and the running result is not ok 11701 1727096128.28281: done checking to see if all hosts have failed 11701 1727096128.28282: getting the remaining hosts for this loop 11701 1727096128.28283: done getting the remaining hosts for this loop 11701 1727096128.28287: getting the next task for host managed_node3 11701 1727096128.28293: done getting next task for host managed_node3 11701 1727096128.28297: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11701 1727096128.28300: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096128.28314: getting variables 11701 1727096128.28315: in VariableManager get_vars() 11701 1727096128.28351: Calling all_inventory to load vars for managed_node3 11701 1727096128.28354: Calling groups_inventory to load vars for managed_node3 11701 1727096128.28356: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096128.28364: Calling all_plugins_play to load vars for managed_node3 11701 1727096128.28366: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096128.28437: Calling groups_plugins_play to load vars for managed_node3 11701 1727096128.29819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096128.30688: done with get_vars() 11701 1727096128.30710: done getting variables 11701 1727096128.30759: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 08:55:28 -0400 (0:00:00.120) 0:00:12.272 ****** 11701 1727096128.30787: entering _queue_task() for managed_node3/package 11701 1727096128.31076: worker is 1 (out of 1 available) 11701 1727096128.31089: exiting _queue_task() for managed_node3/package 11701 1727096128.31102: done queuing things up, now waiting for results queue to drain 11701 1727096128.31103: waiting for pending results... 
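[editor's note, not part of the captured log] The "Install packages" task above was skipped because its conditional, not network_packages is subset(ansible_facts.packages.keys()), evaluated to False: every package the role requires is already present in the gathered package facts. The sketch below only illustrates what that subset test reduces to; the package names and fact layout are placeholders, not values taken from this run.

```python
# Illustrative only: the "subset" test behind the skipped "Install packages"
# task is essentially a set comparison. Package names/versions are made up.
required_packages = ["NetworkManager"]            # stand-in for network_packages
installed_packages = {                            # stand-in for ansible_facts.packages
    "NetworkManager": [{"version": "1.48.10"}],
    "openssh-server": [{"version": "9.8"}],
}

needs_install = not set(required_packages).issubset(installed_packages.keys())
print(needs_install)  # False -> the task is skipped, matching the log above
```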
11701 1727096128.31477: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11701 1727096128.31549: in run() - task 0afff68d-5257-a05c-c957-00000000002f 11701 1727096128.31574: variable 'ansible_search_path' from source: unknown 11701 1727096128.31584: variable 'ansible_search_path' from source: unknown 11701 1727096128.31633: calling self._execute() 11701 1727096128.31735: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096128.31748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096128.31763: variable 'omit' from source: magic vars 11701 1727096128.32149: variable 'ansible_distribution_major_version' from source: facts 11701 1727096128.32246: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096128.32294: variable 'network_state' from source: role '' defaults 11701 1727096128.32307: Evaluated conditional (network_state != {}): False 11701 1727096128.32314: when evaluation is False, skipping this task 11701 1727096128.32319: _execute() done 11701 1727096128.32324: dumping result to json 11701 1727096128.32329: done dumping result, returning 11701 1727096128.32339: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-a05c-c957-00000000002f] 11701 1727096128.32346: sending task result for task 0afff68d-5257-a05c-c957-00000000002f skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11701 1727096128.32515: no more pending results, returning what we have 11701 1727096128.32519: results queue empty 11701 1727096128.32520: checking for any_errors_fatal 11701 1727096128.32525: done checking for any_errors_fatal 11701 1727096128.32526: checking for max_fail_percentage 11701 1727096128.32528: done checking for max_fail_percentage 11701 1727096128.32529: checking to see if all hosts have failed and the running result is not ok 11701 1727096128.32530: done checking to see if all hosts have failed 11701 1727096128.32531: getting the remaining hosts for this loop 11701 1727096128.32532: done getting the remaining hosts for this loop 11701 1727096128.32536: getting the next task for host managed_node3 11701 1727096128.32545: done getting next task for host managed_node3 11701 1727096128.32549: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11701 1727096128.32552: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096128.32572: getting variables 11701 1727096128.32574: in VariableManager get_vars() 11701 1727096128.32618: Calling all_inventory to load vars for managed_node3 11701 1727096128.32620: Calling groups_inventory to load vars for managed_node3 11701 1727096128.32622: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096128.32636: Calling all_plugins_play to load vars for managed_node3 11701 1727096128.32638: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096128.32641: Calling groups_plugins_play to load vars for managed_node3 11701 1727096128.33384: done sending task result for task 0afff68d-5257-a05c-c957-00000000002f 11701 1727096128.33388: WORKER PROCESS EXITING 11701 1727096128.34427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096128.40690: done with get_vars() 11701 1727096128.40718: done getting variables 11701 1727096128.40765: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 08:55:28 -0400 (0:00:00.100) 0:00:12.372 ****** 11701 1727096128.40794: entering _queue_task() for managed_node3/package 11701 1727096128.41117: worker is 1 (out of 1 available) 11701 1727096128.41132: exiting _queue_task() for managed_node3/package 11701 1727096128.41145: done queuing things up, now waiting for results queue to drain 11701 1727096128.41147: waiting for pending results... 
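[editor's note, not part of the captured log] The skip just above ("Install NetworkManager and nmstate when using network_state variable") and the one that follows both hinge on the when: expression network_state != {}, with network_state coming from the role defaults. The snippet below is a minimal sketch of evaluating such an expression with Jinja2 under those assumptions; it is not Ansible's actual conditional machinery, which does considerably more.

```python
# Minimal sketch: a `when:` string is a Jinja2 expression evaluated against
# task vars; with the role default network_state: {} it comes out False.
from jinja2 import Environment

env = Environment()
when_expr = env.compile_expression("network_state != {}")

print(bool(when_expr(network_state={})))                  # False -> task skipped
print(bool(when_expr(network_state={"interfaces": []})))  # True  -> task would run
```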
11701 1727096128.41729: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11701 1727096128.41979: in run() - task 0afff68d-5257-a05c-c957-000000000030 11701 1727096128.42000: variable 'ansible_search_path' from source: unknown 11701 1727096128.42007: variable 'ansible_search_path' from source: unknown 11701 1727096128.42049: calling self._execute() 11701 1727096128.42374: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096128.42378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096128.42381: variable 'omit' from source: magic vars 11701 1727096128.43060: variable 'ansible_distribution_major_version' from source: facts 11701 1727096128.43145: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096128.43385: variable 'network_state' from source: role '' defaults 11701 1727096128.43399: Evaluated conditional (network_state != {}): False 11701 1727096128.43408: when evaluation is False, skipping this task 11701 1727096128.43416: _execute() done 11701 1727096128.43423: dumping result to json 11701 1727096128.43431: done dumping result, returning 11701 1727096128.43465: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-a05c-c957-000000000030] 11701 1727096128.43477: sending task result for task 0afff68d-5257-a05c-c957-000000000030 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11701 1727096128.43741: no more pending results, returning what we have 11701 1727096128.43746: results queue empty 11701 1727096128.43746: checking for any_errors_fatal 11701 1727096128.43752: done checking for any_errors_fatal 11701 1727096128.43753: checking for max_fail_percentage 11701 1727096128.43755: done checking for max_fail_percentage 11701 1727096128.43756: checking to see if all hosts have failed and the running result is not ok 11701 1727096128.43757: done checking to see if all hosts have failed 11701 1727096128.43758: getting the remaining hosts for this loop 11701 1727096128.43759: done getting the remaining hosts for this loop 11701 1727096128.43762: getting the next task for host managed_node3 11701 1727096128.43771: done getting next task for host managed_node3 11701 1727096128.43777: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11701 1727096128.43780: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096128.43797: getting variables 11701 1727096128.43799: in VariableManager get_vars() 11701 1727096128.43844: Calling all_inventory to load vars for managed_node3 11701 1727096128.43847: Calling groups_inventory to load vars for managed_node3 11701 1727096128.43850: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096128.43862: Calling all_plugins_play to load vars for managed_node3 11701 1727096128.43865: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096128.43973: Calling groups_plugins_play to load vars for managed_node3 11701 1727096128.44029: done sending task result for task 0afff68d-5257-a05c-c957-000000000030 11701 1727096128.44032: WORKER PROCESS EXITING 11701 1727096128.45495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096128.47041: done with get_vars() 11701 1727096128.47069: done getting variables 11701 1727096128.47165: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 08:55:28 -0400 (0:00:00.064) 0:00:12.436 ****** 11701 1727096128.47197: entering _queue_task() for managed_node3/service 11701 1727096128.47198: Creating lock for service 11701 1727096128.47517: worker is 1 (out of 1 available) 11701 1727096128.47529: exiting _queue_task() for managed_node3/service 11701 1727096128.47543: done queuing things up, now waiting for results queue to drain 11701 1727096128.47544: waiting for pending results... 
11701 1727096128.48016: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11701 1727096128.48375: in run() - task 0afff68d-5257-a05c-c957-000000000031 11701 1727096128.48397: variable 'ansible_search_path' from source: unknown 11701 1727096128.48405: variable 'ansible_search_path' from source: unknown 11701 1727096128.48444: calling self._execute() 11701 1727096128.48653: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096128.48664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096128.48680: variable 'omit' from source: magic vars 11701 1727096128.49448: variable 'ansible_distribution_major_version' from source: facts 11701 1727096128.49464: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096128.49748: variable '__network_wireless_connections_defined' from source: role '' defaults 11701 1727096128.50083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11701 1727096128.52680: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11701 1727096128.53497: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11701 1727096128.53545: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11701 1727096128.53585: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11701 1727096128.53614: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11701 1727096128.53701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096128.53734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096128.53770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096128.53813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096128.53832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096128.53887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096128.53916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096128.53943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 11701 1727096128.53987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096128.54006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096128.54047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096128.54172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096128.54176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096128.54178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096128.54181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096128.54336: variable 'network_connections' from source: task vars 11701 1727096128.54353: variable 'controller_profile' from source: play vars 11701 1727096128.54430: variable 'controller_profile' from source: play vars 11701 1727096128.54445: variable 'controller_device' from source: play vars 11701 1727096128.54511: variable 'controller_device' from source: play vars 11701 1727096128.54530: variable 'port1_profile' from source: play vars 11701 1727096128.54591: variable 'port1_profile' from source: play vars 11701 1727096128.54605: variable 'dhcp_interface1' from source: play vars 11701 1727096128.54672: variable 'dhcp_interface1' from source: play vars 11701 1727096128.54684: variable 'controller_profile' from source: play vars 11701 1727096128.54748: variable 'controller_profile' from source: play vars 11701 1727096128.54760: variable 'port2_profile' from source: play vars 11701 1727096128.54815: variable 'port2_profile' from source: play vars 11701 1727096128.54826: variable 'dhcp_interface2' from source: play vars 11701 1727096128.54884: variable 'dhcp_interface2' from source: play vars 11701 1727096128.54895: variable 'controller_profile' from source: play vars 11701 1727096128.54973: variable 'controller_profile' from source: play vars 11701 1727096128.55024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11701 1727096128.55213: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11701 1727096128.55327: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11701 1727096128.55370: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11701 1727096128.55474: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11701 1727096128.55477: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11701 1727096128.55495: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11701 1727096128.55525: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096128.55555: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11701 1727096128.55638: variable '__network_team_connections_defined' from source: role '' defaults 11701 1727096128.56076: variable 'network_connections' from source: task vars 11701 1727096128.56174: variable 'controller_profile' from source: play vars 11701 1727096128.56291: variable 'controller_profile' from source: play vars 11701 1727096128.56374: variable 'controller_device' from source: play vars 11701 1727096128.56575: variable 'controller_device' from source: play vars 11701 1727096128.56578: variable 'port1_profile' from source: play vars 11701 1727096128.56581: variable 'port1_profile' from source: play vars 11701 1727096128.56583: variable 'dhcp_interface1' from source: play vars 11701 1727096128.56710: variable 'dhcp_interface1' from source: play vars 11701 1727096128.56721: variable 'controller_profile' from source: play vars 11701 1727096128.56873: variable 'controller_profile' from source: play vars 11701 1727096128.56882: variable 'port2_profile' from source: play vars 11701 1727096128.56947: variable 'port2_profile' from source: play vars 11701 1727096128.57018: variable 'dhcp_interface2' from source: play vars 11701 1727096128.57112: variable 'dhcp_interface2' from source: play vars 11701 1727096128.57131: variable 'controller_profile' from source: play vars 11701 1727096128.57202: variable 'controller_profile' from source: play vars 11701 1727096128.57247: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11701 1727096128.57255: when evaluation is False, skipping this task 11701 1727096128.57263: _execute() done 11701 1727096128.57272: dumping result to json 11701 1727096128.57279: done dumping result, returning 11701 1727096128.57291: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-a05c-c957-000000000031] 11701 1727096128.57301: sending task result for task 0afff68d-5257-a05c-c957-000000000031 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11701 1727096128.57490: no more pending results, returning what we have 11701 1727096128.57494: results queue empty 11701 1727096128.57495: checking for any_errors_fatal 11701 1727096128.57501: done checking for any_errors_fatal 11701 1727096128.57502: checking for max_fail_percentage 11701 1727096128.57504: done checking for max_fail_percentage 11701 1727096128.57505: checking to see if all hosts have failed and the running result is not ok 11701 1727096128.57506: done checking to see if all hosts have 
failed 11701 1727096128.57507: getting the remaining hosts for this loop 11701 1727096128.57508: done getting the remaining hosts for this loop 11701 1727096128.57512: getting the next task for host managed_node3 11701 1727096128.57520: done getting next task for host managed_node3 11701 1727096128.57524: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11701 1727096128.57527: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11701 1727096128.57542: getting variables 11701 1727096128.57544: in VariableManager get_vars() 11701 1727096128.57590: Calling all_inventory to load vars for managed_node3 11701 1727096128.57592: Calling groups_inventory to load vars for managed_node3 11701 1727096128.57596: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096128.57608: Calling all_plugins_play to load vars for managed_node3 11701 1727096128.57611: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096128.57614: Calling groups_plugins_play to load vars for managed_node3 11701 1727096128.58281: done sending task result for task 0afff68d-5257-a05c-c957-000000000031 11701 1727096128.58284: WORKER PROCESS EXITING 11701 1727096128.59425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096128.61783: done with get_vars() 11701 1727096128.61820: done getting variables 11701 1727096128.61885: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 08:55:28 -0400 (0:00:00.147) 0:00:12.583 ****** 11701 1727096128.61919: entering _queue_task() for managed_node3/service 11701 1727096128.62275: worker is 1 (out of 1 available) 11701 1727096128.62289: exiting _queue_task() for managed_node3/service 11701 1727096128.62301: done queuing things up, now waiting for results queue to drain 11701 1727096128.62303: waiting for pending results... 
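[editor's note, not part of the captured log] The long runs of network_connections / controller_profile / dhcp_interface lookups above feed the templating of __network_wireless_connections_defined and __network_team_connections_defined; because the bond profiles in this play contain no wireless or team connections, the "Restart NetworkManager due to wireless or team interfaces" task is skipped. The sketch below is a rough reconstruction of what those flags amount to; the connection list is illustrative and not the actual play vars.

```python
# Rough reconstruction (not the role's real templating): derive "any wireless?"
# and "any team?" flags from a connection list shaped like this play's bond setup.
network_connections = [
    {"name": "bond0", "type": "bond"},
    {"name": "bond0.0", "type": "ethernet", "controller": "bond0"},
    {"name": "bond0.1", "type": "ethernet", "controller": "bond0"},
]

wireless_defined = any(c.get("type") == "wireless" for c in network_connections)
team_defined = any(c.get("type") == "team" for c in network_connections)

print(wireless_defined or team_defined)  # False -> restart task skipped, as logged
```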
11701 1727096128.62599: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11701 1727096128.62725: in run() - task 0afff68d-5257-a05c-c957-000000000032 11701 1727096128.62753: variable 'ansible_search_path' from source: unknown 11701 1727096128.62762: variable 'ansible_search_path' from source: unknown 11701 1727096128.62804: calling self._execute() 11701 1727096128.62902: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096128.62914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096128.62929: variable 'omit' from source: magic vars 11701 1727096128.63303: variable 'ansible_distribution_major_version' from source: facts 11701 1727096128.63319: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096128.63487: variable 'network_provider' from source: set_fact 11701 1727096128.63557: variable 'network_state' from source: role '' defaults 11701 1727096128.63628: Evaluated conditional (network_provider == "nm" or network_state != {}): True 11701 1727096128.63737: variable 'omit' from source: magic vars 11701 1727096128.64075: variable 'omit' from source: magic vars 11701 1727096128.64079: variable 'network_service_name' from source: role '' defaults 11701 1727096128.64131: variable 'network_service_name' from source: role '' defaults 11701 1727096128.64440: variable '__network_provider_setup' from source: role '' defaults 11701 1727096128.64673: variable '__network_service_name_default_nm' from source: role '' defaults 11701 1727096128.64677: variable '__network_service_name_default_nm' from source: role '' defaults 11701 1727096128.64679: variable '__network_packages_default_nm' from source: role '' defaults 11701 1727096128.64682: variable '__network_packages_default_nm' from source: role '' defaults 11701 1727096128.65013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11701 1727096128.68329: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11701 1727096128.68424: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11701 1727096128.68472: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11701 1727096128.68519: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11701 1727096128.68553: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11701 1727096128.68649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096128.68692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096128.68728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096128.68780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 11701 1727096128.68800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096128.68859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096128.68891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096128.69383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096128.69386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096128.69388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096128.69891: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11701 1727096128.70134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096128.70171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096128.70276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096128.70342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096128.70494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096128.70684: variable 'ansible_python' from source: facts 11701 1727096128.70687: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11701 1727096128.70821: variable '__network_wpa_supplicant_required' from source: role '' defaults 11701 1727096128.70912: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11701 1727096128.71062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096128.71094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096128.71128: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096128.71172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096128.71191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096128.71245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096128.71285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096128.71312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096128.71444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096128.71448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096128.71522: variable 'network_connections' from source: task vars 11701 1727096128.71534: variable 'controller_profile' from source: play vars 11701 1727096128.71618: variable 'controller_profile' from source: play vars 11701 1727096128.71634: variable 'controller_device' from source: play vars 11701 1727096128.71715: variable 'controller_device' from source: play vars 11701 1727096128.71732: variable 'port1_profile' from source: play vars 11701 1727096128.71814: variable 'port1_profile' from source: play vars 11701 1727096128.71831: variable 'dhcp_interface1' from source: play vars 11701 1727096128.71910: variable 'dhcp_interface1' from source: play vars 11701 1727096128.71924: variable 'controller_profile' from source: play vars 11701 1727096128.72004: variable 'controller_profile' from source: play vars 11701 1727096128.72019: variable 'port2_profile' from source: play vars 11701 1727096128.72096: variable 'port2_profile' from source: play vars 11701 1727096128.72111: variable 'dhcp_interface2' from source: play vars 11701 1727096128.72185: variable 'dhcp_interface2' from source: play vars 11701 1727096128.72206: variable 'controller_profile' from source: play vars 11701 1727096128.72310: variable 'controller_profile' from source: play vars 11701 1727096128.72392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11701 1727096128.72612: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11701 1727096128.72672: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11701 1727096128.72718: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11701 
1727096128.72770: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11701 1727096128.72853: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11701 1727096128.72879: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11701 1727096128.72962: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096128.72965: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11701 1727096128.73007: variable '__network_wireless_connections_defined' from source: role '' defaults 11701 1727096128.73268: variable 'network_connections' from source: task vars 11701 1727096128.73289: variable 'controller_profile' from source: play vars 11701 1727096128.73365: variable 'controller_profile' from source: play vars 11701 1727096128.73386: variable 'controller_device' from source: play vars 11701 1727096128.73470: variable 'controller_device' from source: play vars 11701 1727096128.73503: variable 'port1_profile' from source: play vars 11701 1727096128.73575: variable 'port1_profile' from source: play vars 11701 1727096128.73612: variable 'dhcp_interface1' from source: play vars 11701 1727096128.73675: variable 'dhcp_interface1' from source: play vars 11701 1727096128.73691: variable 'controller_profile' from source: play vars 11701 1727096128.73831: variable 'controller_profile' from source: play vars 11701 1727096128.73834: variable 'port2_profile' from source: play vars 11701 1727096128.73866: variable 'port2_profile' from source: play vars 11701 1727096128.73885: variable 'dhcp_interface2' from source: play vars 11701 1727096128.73971: variable 'dhcp_interface2' from source: play vars 11701 1727096128.73988: variable 'controller_profile' from source: play vars 11701 1727096128.74065: variable 'controller_profile' from source: play vars 11701 1727096128.74125: variable '__network_packages_default_wireless' from source: role '' defaults 11701 1727096128.74220: variable '__network_wireless_connections_defined' from source: role '' defaults 11701 1727096128.74545: variable 'network_connections' from source: task vars 11701 1727096128.74565: variable 'controller_profile' from source: play vars 11701 1727096128.74676: variable 'controller_profile' from source: play vars 11701 1727096128.74679: variable 'controller_device' from source: play vars 11701 1727096128.74729: variable 'controller_device' from source: play vars 11701 1727096128.74743: variable 'port1_profile' from source: play vars 11701 1727096128.74825: variable 'port1_profile' from source: play vars 11701 1727096128.74837: variable 'dhcp_interface1' from source: play vars 11701 1727096128.74907: variable 'dhcp_interface1' from source: play vars 11701 1727096128.75001: variable 'controller_profile' from source: play vars 11701 1727096128.75004: variable 'controller_profile' from source: play vars 11701 1727096128.75006: variable 'port2_profile' from source: play vars 11701 1727096128.75061: variable 'port2_profile' from source: play vars 11701 
1727096128.75080: variable 'dhcp_interface2' from source: play vars 11701 1727096128.75161: variable 'dhcp_interface2' from source: play vars 11701 1727096128.75176: variable 'controller_profile' from source: play vars 11701 1727096128.75257: variable 'controller_profile' from source: play vars 11701 1727096128.75289: variable '__network_packages_default_team' from source: role '' defaults 11701 1727096128.75372: variable '__network_team_connections_defined' from source: role '' defaults 11701 1727096128.75706: variable 'network_connections' from source: task vars 11701 1727096128.75716: variable 'controller_profile' from source: play vars 11701 1727096128.75796: variable 'controller_profile' from source: play vars 11701 1727096128.75807: variable 'controller_device' from source: play vars 11701 1727096128.75883: variable 'controller_device' from source: play vars 11701 1727096128.75901: variable 'port1_profile' from source: play vars 11701 1727096128.75973: variable 'port1_profile' from source: play vars 11701 1727096128.76009: variable 'dhcp_interface1' from source: play vars 11701 1727096128.76066: variable 'dhcp_interface1' from source: play vars 11701 1727096128.76089: variable 'controller_profile' from source: play vars 11701 1727096128.76158: variable 'controller_profile' from source: play vars 11701 1727096128.76198: variable 'port2_profile' from source: play vars 11701 1727096128.76255: variable 'port2_profile' from source: play vars 11701 1727096128.76273: variable 'dhcp_interface2' from source: play vars 11701 1727096128.76359: variable 'dhcp_interface2' from source: play vars 11701 1727096128.76417: variable 'controller_profile' from source: play vars 11701 1727096128.76455: variable 'controller_profile' from source: play vars 11701 1727096128.76530: variable '__network_service_name_default_initscripts' from source: role '' defaults 11701 1727096128.76604: variable '__network_service_name_default_initscripts' from source: role '' defaults 11701 1727096128.76616: variable '__network_packages_default_initscripts' from source: role '' defaults 11701 1727096128.76687: variable '__network_packages_default_initscripts' from source: role '' defaults 11701 1727096128.76923: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11701 1727096128.77501: variable 'network_connections' from source: task vars 11701 1727096128.77504: variable 'controller_profile' from source: play vars 11701 1727096128.77516: variable 'controller_profile' from source: play vars 11701 1727096128.77531: variable 'controller_device' from source: play vars 11701 1727096128.77592: variable 'controller_device' from source: play vars 11701 1727096128.77609: variable 'port1_profile' from source: play vars 11701 1727096128.77675: variable 'port1_profile' from source: play vars 11701 1727096128.77717: variable 'dhcp_interface1' from source: play vars 11701 1727096128.77755: variable 'dhcp_interface1' from source: play vars 11701 1727096128.77766: variable 'controller_profile' from source: play vars 11701 1727096128.77836: variable 'controller_profile' from source: play vars 11701 1727096128.77853: variable 'port2_profile' from source: play vars 11701 1727096128.77913: variable 'port2_profile' from source: play vars 11701 1727096128.77935: variable 'dhcp_interface2' from source: play vars 11701 1727096128.78042: variable 'dhcp_interface2' from source: play vars 11701 1727096128.78045: variable 'controller_profile' from source: play vars 11701 1727096128.78079: variable 
'controller_profile' from source: play vars 11701 1727096128.78091: variable 'ansible_distribution' from source: facts 11701 1727096128.78099: variable '__network_rh_distros' from source: role '' defaults 11701 1727096128.78108: variable 'ansible_distribution_major_version' from source: facts 11701 1727096128.78137: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11701 1727096128.78328: variable 'ansible_distribution' from source: facts 11701 1727096128.78374: variable '__network_rh_distros' from source: role '' defaults 11701 1727096128.78377: variable 'ansible_distribution_major_version' from source: facts 11701 1727096128.78379: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11701 1727096128.78542: variable 'ansible_distribution' from source: facts 11701 1727096128.78552: variable '__network_rh_distros' from source: role '' defaults 11701 1727096128.78562: variable 'ansible_distribution_major_version' from source: facts 11701 1727096128.78615: variable 'network_provider' from source: set_fact 11701 1727096128.78641: variable 'omit' from source: magic vars 11701 1727096128.78699: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096128.78703: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096128.78729: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096128.78747: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096128.78760: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096128.78792: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096128.78806: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096128.78808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096128.78932: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096128.78936: Set connection var ansible_timeout to 10 11701 1727096128.79072: Set connection var ansible_shell_type to sh 11701 1727096128.79076: Set connection var ansible_shell_executable to /bin/sh 11701 1727096128.79078: Set connection var ansible_connection to ssh 11701 1727096128.79080: Set connection var ansible_pipelining to False 11701 1727096128.79082: variable 'ansible_shell_executable' from source: unknown 11701 1727096128.79084: variable 'ansible_connection' from source: unknown 11701 1727096128.79086: variable 'ansible_module_compression' from source: unknown 11701 1727096128.79087: variable 'ansible_shell_type' from source: unknown 11701 1727096128.79089: variable 'ansible_shell_executable' from source: unknown 11701 1727096128.79091: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096128.79093: variable 'ansible_pipelining' from source: unknown 11701 1727096128.79095: variable 'ansible_timeout' from source: unknown 11701 1727096128.79097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096128.79152: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096128.79172: variable 'omit' from source: magic vars 11701 1727096128.79184: starting attempt loop 11701 1727096128.79190: running the handler 11701 1727096128.79281: variable 'ansible_facts' from source: unknown 11701 1727096128.80114: _low_level_execute_command(): starting 11701 1727096128.80128: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096128.80974: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096128.81003: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096128.81084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096128.82775: stdout chunk (state=3): >>>/root <<< 11701 1727096128.82939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096128.82942: stdout chunk (state=3): >>><<< 11701 1727096128.82947: stderr chunk (state=3): >>><<< 11701 1727096128.83082: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096128.83091: _low_level_execute_command(): starting 11701 1727096128.83095: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096128.8298807-12305-24709924242106 `" && echo ansible-tmp-1727096128.8298807-12305-24709924242106="` echo /root/.ansible/tmp/ansible-tmp-1727096128.8298807-12305-24709924242106 `" ) && sleep 0' 11701 1727096128.83841: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096128.83865: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096128.83961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096128.85924: stdout chunk (state=3): >>>ansible-tmp-1727096128.8298807-12305-24709924242106=/root/.ansible/tmp/ansible-tmp-1727096128.8298807-12305-24709924242106 <<< 11701 1727096128.86039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096128.86071: stderr chunk (state=3): >>><<< 11701 1727096128.86074: stdout chunk (state=3): >>><<< 11701 1727096128.86092: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096128.8298807-12305-24709924242106=/root/.ansible/tmp/ansible-tmp-1727096128.8298807-12305-24709924242106 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096128.86274: variable 'ansible_module_compression' from source: unknown 11701 1727096128.86279: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 11701 1727096128.86283: ANSIBALLZ: Acquiring lock 11701 
1727096128.86285: ANSIBALLZ: Lock acquired: 139907404354416 11701 1727096128.86288: ANSIBALLZ: Creating module 11701 1727096129.27190: ANSIBALLZ: Writing module into payload 11701 1727096129.27361: ANSIBALLZ: Writing module 11701 1727096129.27398: ANSIBALLZ: Renaming module 11701 1727096129.27411: ANSIBALLZ: Done creating module 11701 1727096129.27440: variable 'ansible_facts' from source: unknown 11701 1727096129.27617: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096128.8298807-12305-24709924242106/AnsiballZ_systemd.py 11701 1727096129.27960: Sending initial data 11701 1727096129.27963: Sent initial data (155 bytes) 11701 1727096129.29275: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096129.29281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096129.29890: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096129.31319: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096129.31353: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11701 1727096129.31502: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpxppksah2 /root/.ansible/tmp/ansible-tmp-1727096128.8298807-12305-24709924242106/AnsiballZ_systemd.py <<< 11701 1727096129.31506: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096128.8298807-12305-24709924242106/AnsiballZ_systemd.py" <<< 11701 1727096129.31539: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpxppksah2" to remote "/root/.ansible/tmp/ansible-tmp-1727096128.8298807-12305-24709924242106/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096128.8298807-12305-24709924242106/AnsiballZ_systemd.py" <<< 11701 1727096129.33945: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096129.33972: stderr chunk (state=3): >>><<< 11701 1727096129.33975: stdout chunk (state=3): >>><<< 11701 1727096129.34081: done transferring module to remote 11701 1727096129.34091: _low_level_execute_command(): starting 11701 1727096129.34096: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096128.8298807-12305-24709924242106/ /root/.ansible/tmp/ansible-tmp-1727096128.8298807-12305-24709924242106/AnsiballZ_systemd.py && sleep 0' 11701 1727096129.35074: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096129.35078: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096129.35086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096129.35094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096129.35110: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096129.35125: stderr chunk (state=3): >>>debug2: match not found <<< 11701 1727096129.35311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096129.35314: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11701 1727096129.35316: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 11701 1727096129.35318: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11701 1727096129.35319: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096129.35321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096129.35323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096129.35325: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096129.35326: stderr chunk (state=3): >>>debug2: match found <<< 11701 1727096129.35328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096129.35333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096129.35334: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096129.35378: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 
1727096129.35422: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096129.37299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096129.37427: stderr chunk (state=3): >>><<< 11701 1727096129.37623: stdout chunk (state=3): >>><<< 11701 1727096129.37626: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096129.37629: _low_level_execute_command(): starting 11701 1727096129.37632: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096128.8298807-12305-24709924242106/AnsiballZ_systemd.py && sleep 0' 11701 1727096129.38597: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096129.38600: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096129.38602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096129.38605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096129.38607: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096129.38610: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096129.38648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096129.38662: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096129.38690: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096129.38882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096129.68631: stdout chunk 
(state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainStartTimestampMonotonic": "22098154", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainHandoffTimestampMonotonic": "22114950", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10297344", "MemoryPeak": "14127104", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3322396672", "EffectiveMemoryMax": "3702882304", "EffectiveMemoryHigh": "3702882304", "CPUUsageNSec": "598260000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", 
"MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 11701 1727096129.68707: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service NetworkManager-wait-online.service network.target multi-user.target", "After": "systemd-journald.socket basic.target system.slice sysinit.target dbus.socket dbus-broker.service cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:54:46 EDT", "StateChangeTimestampMonotonic": "229668491", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:17 EDT", "InactiveExitTimestampMonotonic": "22098658", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:18 EDT", "ActiveEnterTimestampMonotonic": "22578647", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ConditionTimestampMonotonic": "22097143", "AssertTimestamp": "Mon 2024-09-23 08:51:17 EDT", "AssertTimestampMonotonic": "22097145", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "795485e52798420593bcdae791c1fbbf", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11701 1727096129.70779: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 
10.31.14.152 closed. <<< 11701 1727096129.70783: stdout chunk (state=3): >>><<< 11701 1727096129.70786: stderr chunk (state=3): >>><<< 11701 1727096129.70804: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainStartTimestampMonotonic": "22098154", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainHandoffTimestampMonotonic": "22114950", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10297344", "MemoryPeak": "14127104", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3322396672", "EffectiveMemoryMax": "3702882304", "EffectiveMemoryHigh": "3702882304", "CPUUsageNSec": "598260000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", 
"DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service NetworkManager-wait-online.service network.target multi-user.target", "After": "systemd-journald.socket basic.target system.slice sysinit.target dbus.socket dbus-broker.service cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:54:46 EDT", "StateChangeTimestampMonotonic": "229668491", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:17 EDT", "InactiveExitTimestampMonotonic": "22098658", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:18 EDT", "ActiveEnterTimestampMonotonic": "22578647", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ConditionTimestampMonotonic": "22097143", "AssertTimestamp": "Mon 2024-09-23 08:51:17 EDT", "AssertTimestampMonotonic": "22097145", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "795485e52798420593bcdae791c1fbbf", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 11701 1727096129.71157: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096128.8298807-12305-24709924242106/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096129.71185: _low_level_execute_command(): starting 11701 1727096129.71190: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096128.8298807-12305-24709924242106/ > /dev/null 2>&1 && sleep 0' 11701 1727096129.71823: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096129.71834: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096129.71845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096129.71862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096129.71877: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096129.71884: stderr chunk (state=3): >>>debug2: match not found <<< 11701 1727096129.71894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096129.71925: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096129.72004: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096129.72021: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096129.72069: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096129.74159: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096129.74191: stdout chunk (state=3): >>><<< 11701 1727096129.74195: stderr chunk (state=3): >>><<< 11701 1727096129.74326: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096129.74330: handler run complete 11701 1727096129.74332: attempt loop complete, returning result 11701 1727096129.74335: _execute() done 11701 1727096129.74337: dumping result to json 11701 1727096129.74339: done dumping result, returning 11701 1727096129.74341: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-a05c-c957-000000000032] 11701 1727096129.74343: sending task result for task 0afff68d-5257-a05c-c957-000000000032 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11701 1727096129.74822: no more pending results, returning what we have 11701 1727096129.74827: results queue empty 11701 1727096129.74828: checking for any_errors_fatal 11701 1727096129.74833: done checking for any_errors_fatal 11701 1727096129.74834: checking for max_fail_percentage 11701 1727096129.74835: done checking for max_fail_percentage 11701 1727096129.74836: checking to see if all hosts have failed and the running result is not ok 11701 1727096129.74837: done checking to see if all hosts have failed 11701 1727096129.74838: getting the remaining hosts for this loop 11701 1727096129.74839: done getting the remaining hosts for this loop 11701 1727096129.74843: getting the next task for host managed_node3 11701 1727096129.74853: done getting next task for host managed_node3 11701 1727096129.74857: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11701 1727096129.74860: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11701 1727096129.74873: getting variables 11701 1727096129.74876: in VariableManager get_vars() 11701 1727096129.74916: Calling all_inventory to load vars for managed_node3 11701 1727096129.74919: Calling groups_inventory to load vars for managed_node3 11701 1727096129.74922: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096129.74933: Calling all_plugins_play to load vars for managed_node3 11701 1727096129.74936: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096129.74939: Calling groups_plugins_play to load vars for managed_node3 11701 1727096129.75626: done sending task result for task 0afff68d-5257-a05c-c957-000000000032 11701 1727096129.75629: WORKER PROCESS EXITING 11701 1727096129.77098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096129.80040: done with get_vars() 11701 1727096129.80069: done getting variables 11701 1727096129.80140: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 08:55:29 -0400 (0:00:01.182) 0:00:13.766 ****** 11701 1727096129.80182: entering _queue_task() for managed_node3/service 11701 1727096129.80625: worker is 1 (out of 1 available) 11701 1727096129.80639: exiting _queue_task() for managed_node3/service 11701 1727096129.80649: done queuing things up, now waiting for results queue to drain 11701 1727096129.80653: waiting for pending results... 
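Before the run moves on to the next task, the no_log-censored result above is worth unpacking: the recorded invocation was ansible.legacy.systemd with name=NetworkManager, state=started, enabled=true, and it came back changed=false, consistent with the unit already being active and enabled in the returned property dump. A minimal sketch of what that invocation amounts to on the target, assuming a systemd host and root privileges (an illustration only, not the module's implementation):

    import subprocess

    UNIT = "NetworkManager.service"

    def run(*args):
        # Run a command and return its stdout (raises on a non-zero exit).
        return subprocess.run(args, check=True, capture_output=True, text=True).stdout

    run("systemctl", "enable", UNIT)   # corresponds to enabled: true
    run("systemctl", "start", UNIT)    # corresponds to state: started

    # The "status" block in the module result above is essentially the unit's
    # property list, comparable to the KEY=VALUE pairs `systemctl show` prints.
    props = dict(
        line.split("=", 1)
        for line in run("systemctl", "show", UNIT).splitlines()
        if "=" in line
    )
    print(props.get("ActiveState"), props.get("UnitFileState"))  # expect: active enabled

The ActiveState=active and UnitFileState=enabled values this sketch reads back are the same fields visible in the (pre-censorship) module result logged above.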
11701 1727096129.80853: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11701 1727096129.80991: in run() - task 0afff68d-5257-a05c-c957-000000000033 11701 1727096129.81016: variable 'ansible_search_path' from source: unknown 11701 1727096129.81025: variable 'ansible_search_path' from source: unknown 11701 1727096129.81070: calling self._execute() 11701 1727096129.81168: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096129.81181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096129.81196: variable 'omit' from source: magic vars 11701 1727096129.81584: variable 'ansible_distribution_major_version' from source: facts 11701 1727096129.81602: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096129.81723: variable 'network_provider' from source: set_fact 11701 1727096129.81734: Evaluated conditional (network_provider == "nm"): True 11701 1727096129.81840: variable '__network_wpa_supplicant_required' from source: role '' defaults 11701 1727096129.82173: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11701 1727096129.82176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11701 1727096129.84112: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11701 1727096129.84162: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11701 1727096129.84193: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11701 1727096129.84219: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11701 1727096129.84239: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11701 1727096129.84314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096129.84336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096129.84359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096129.84387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096129.84398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096129.84432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096129.84449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 11701 1727096129.84469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096129.84497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096129.84507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096129.84534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096129.84553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096129.84570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096129.84599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096129.84610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096129.84715: variable 'network_connections' from source: task vars 11701 1727096129.84725: variable 'controller_profile' from source: play vars 11701 1727096129.84779: variable 'controller_profile' from source: play vars 11701 1727096129.84789: variable 'controller_device' from source: play vars 11701 1727096129.84835: variable 'controller_device' from source: play vars 11701 1727096129.84843: variable 'port1_profile' from source: play vars 11701 1727096129.84887: variable 'port1_profile' from source: play vars 11701 1727096129.84893: variable 'dhcp_interface1' from source: play vars 11701 1727096129.84939: variable 'dhcp_interface1' from source: play vars 11701 1727096129.84944: variable 'controller_profile' from source: play vars 11701 1727096129.84988: variable 'controller_profile' from source: play vars 11701 1727096129.84994: variable 'port2_profile' from source: play vars 11701 1727096129.85039: variable 'port2_profile' from source: play vars 11701 1727096129.85045: variable 'dhcp_interface2' from source: play vars 11701 1727096129.85090: variable 'dhcp_interface2' from source: play vars 11701 1727096129.85096: variable 'controller_profile' from source: play vars 11701 1727096129.85140: variable 'controller_profile' from source: play vars 11701 1727096129.85192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11701 1727096129.85307: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11701 1727096129.85335: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11701 1727096129.85362: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11701 1727096129.85405: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11701 1727096129.85430: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11701 1727096129.85478: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11701 1727096129.85489: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096129.85585: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11701 1727096129.85592: variable '__network_wireless_connections_defined' from source: role '' defaults 11701 1727096129.85973: variable 'network_connections' from source: task vars 11701 1727096129.85977: variable 'controller_profile' from source: play vars 11701 1727096129.85979: variable 'controller_profile' from source: play vars 11701 1727096129.85981: variable 'controller_device' from source: play vars 11701 1727096129.85983: variable 'controller_device' from source: play vars 11701 1727096129.85985: variable 'port1_profile' from source: play vars 11701 1727096129.86009: variable 'port1_profile' from source: play vars 11701 1727096129.86012: variable 'dhcp_interface1' from source: play vars 11701 1727096129.86064: variable 'dhcp_interface1' from source: play vars 11701 1727096129.86073: variable 'controller_profile' from source: play vars 11701 1727096129.86129: variable 'controller_profile' from source: play vars 11701 1727096129.86136: variable 'port2_profile' from source: play vars 11701 1727096129.86195: variable 'port2_profile' from source: play vars 11701 1727096129.86202: variable 'dhcp_interface2' from source: play vars 11701 1727096129.86259: variable 'dhcp_interface2' from source: play vars 11701 1727096129.86324: variable 'controller_profile' from source: play vars 11701 1727096129.86327: variable 'controller_profile' from source: play vars 11701 1727096129.86365: Evaluated conditional (__network_wpa_supplicant_required): False 11701 1727096129.86370: when evaluation is False, skipping this task 11701 1727096129.86372: _execute() done 11701 1727096129.86375: dumping result to json 11701 1727096129.86379: done dumping result, returning 11701 1727096129.86388: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-a05c-c957-000000000033] 11701 1727096129.86393: sending task result for task 0afff68d-5257-a05c-c957-000000000033 11701 1727096129.86494: done sending task result for task 0afff68d-5257-a05c-c957-000000000033 11701 1727096129.86497: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 11701 1727096129.86590: no more pending results, returning what we have 11701 1727096129.86594: results queue empty 11701 1727096129.86595: checking for any_errors_fatal 11701 1727096129.86613: done checking for any_errors_fatal 11701 
1727096129.86614: checking for max_fail_percentage 11701 1727096129.86615: done checking for max_fail_percentage 11701 1727096129.86616: checking to see if all hosts have failed and the running result is not ok 11701 1727096129.86617: done checking to see if all hosts have failed 11701 1727096129.86617: getting the remaining hosts for this loop 11701 1727096129.86619: done getting the remaining hosts for this loop 11701 1727096129.86622: getting the next task for host managed_node3 11701 1727096129.86630: done getting next task for host managed_node3 11701 1727096129.86634: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 11701 1727096129.86636: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11701 1727096129.86653: getting variables 11701 1727096129.86655: in VariableManager get_vars() 11701 1727096129.86693: Calling all_inventory to load vars for managed_node3 11701 1727096129.86695: Calling groups_inventory to load vars for managed_node3 11701 1727096129.86697: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096129.86706: Calling all_plugins_play to load vars for managed_node3 11701 1727096129.86708: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096129.86710: Calling groups_plugins_play to load vars for managed_node3 11701 1727096129.87624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096129.88500: done with get_vars() 11701 1727096129.88523: done getting variables 11701 1727096129.88571: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 08:55:29 -0400 (0:00:00.084) 0:00:13.850 ****** 11701 1727096129.88595: entering _queue_task() for managed_node3/service 11701 1727096129.88851: worker is 1 (out of 1 available) 11701 1727096129.88866: exiting _queue_task() for managed_node3/service 11701 1727096129.88878: done queuing things up, now waiting for results queue to drain 11701 1727096129.88879: waiting for pending results... 
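The skip decisions in this stretch of the log all follow one pattern: each task's when: clauses are rendered against the task vars, and the first falsy one is reported back as false_condition. For the wpa_supplicant task the evaluations were ansible_distribution_major_version != '6' (True), network_provider == "nm" (True) and __network_wpa_supplicant_required (False), so the task was skipped; the "Enable network service" task just queued is skipped the same way on network_provider == "initscripts". A rough stand-alone sketch of that evaluation using plain Jinja2, not Ansible's internal templar, with a hypothetical value standing in for the distribution major version fact:

    from jinja2 import Environment

    # Values mirroring the log: network_provider came from set_fact and
    # __network_wpa_supplicant_required is a role default; "40" is a
    # hypothetical stand-in for the real ansible_distribution_major_version.
    task_vars = {
        "ansible_distribution_major_version": "40",
        "network_provider": "nm",
        "__network_wpa_supplicant_required": False,
    }

    env = Environment()

    def evaluate(conditional, variables):
        # Roughly what "Evaluated conditional (...)" means in the log:
        # render the expression against the task vars, take its truth value.
        return bool(env.compile_expression(conditional)(**variables))

    for conditional in (
        "ansible_distribution_major_version != '6'",
        'network_provider == "nm"',
        "__network_wpa_supplicant_required",
    ):
        print(conditional, "->", evaluate(conditional, task_vars))

The last expression is falsy, which is exactly the false_condition reported for the skipped wpa_supplicant task above.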
11701 1727096129.89073: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 11701 1727096129.89181: in run() - task 0afff68d-5257-a05c-c957-000000000034 11701 1727096129.89195: variable 'ansible_search_path' from source: unknown 11701 1727096129.89199: variable 'ansible_search_path' from source: unknown 11701 1727096129.89246: calling self._execute() 11701 1727096129.89480: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096129.89485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096129.89488: variable 'omit' from source: magic vars 11701 1727096129.89711: variable 'ansible_distribution_major_version' from source: facts 11701 1727096129.89718: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096129.89831: variable 'network_provider' from source: set_fact 11701 1727096129.89835: Evaluated conditional (network_provider == "initscripts"): False 11701 1727096129.89856: when evaluation is False, skipping this task 11701 1727096129.89860: _execute() done 11701 1727096129.89862: dumping result to json 11701 1727096129.89864: done dumping result, returning 11701 1727096129.89866: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-a05c-c957-000000000034] 11701 1727096129.89871: sending task result for task 0afff68d-5257-a05c-c957-000000000034 11701 1727096129.89989: done sending task result for task 0afff68d-5257-a05c-c957-000000000034 11701 1727096129.89992: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11701 1727096129.90070: no more pending results, returning what we have 11701 1727096129.90075: results queue empty 11701 1727096129.90076: checking for any_errors_fatal 11701 1727096129.90085: done checking for any_errors_fatal 11701 1727096129.90086: checking for max_fail_percentage 11701 1727096129.90088: done checking for max_fail_percentage 11701 1727096129.90129: checking to see if all hosts have failed and the running result is not ok 11701 1727096129.90131: done checking to see if all hosts have failed 11701 1727096129.90132: getting the remaining hosts for this loop 11701 1727096129.90134: done getting the remaining hosts for this loop 11701 1727096129.90138: getting the next task for host managed_node3 11701 1727096129.90147: done getting next task for host managed_node3 11701 1727096129.90154: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11701 1727096129.90159: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096129.90178: getting variables 11701 1727096129.90180: in VariableManager get_vars() 11701 1727096129.90225: Calling all_inventory to load vars for managed_node3 11701 1727096129.90229: Calling groups_inventory to load vars for managed_node3 11701 1727096129.90378: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096129.90391: Calling all_plugins_play to load vars for managed_node3 11701 1727096129.90394: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096129.90397: Calling groups_plugins_play to load vars for managed_node3 11701 1727096129.91575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096129.92475: done with get_vars() 11701 1727096129.92503: done getting variables 11701 1727096129.92561: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 08:55:29 -0400 (0:00:00.040) 0:00:13.890 ****** 11701 1727096129.92602: entering _queue_task() for managed_node3/copy 11701 1727096129.92957: worker is 1 (out of 1 available) 11701 1727096129.93172: exiting _queue_task() for managed_node3/copy 11701 1727096129.93184: done queuing things up, now waiting for results queue to drain 11701 1727096129.93186: waiting for pending results... 11701 1727096129.93387: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11701 1727096129.93430: in run() - task 0afff68d-5257-a05c-c957-000000000035 11701 1727096129.93457: variable 'ansible_search_path' from source: unknown 11701 1727096129.93465: variable 'ansible_search_path' from source: unknown 11701 1727096129.93509: calling self._execute() 11701 1727096129.93628: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096129.93632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096129.93737: variable 'omit' from source: magic vars 11701 1727096129.94020: variable 'ansible_distribution_major_version' from source: facts 11701 1727096129.94031: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096129.94130: variable 'network_provider' from source: set_fact 11701 1727096129.94134: Evaluated conditional (network_provider == "initscripts"): False 11701 1727096129.94137: when evaluation is False, skipping this task 11701 1727096129.94140: _execute() done 11701 1727096129.94142: dumping result to json 11701 1727096129.94145: done dumping result, returning 11701 1727096129.94165: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-a05c-c957-000000000035] 11701 1727096129.94194: sending task result for task 0afff68d-5257-a05c-c957-000000000035 11701 1727096129.94271: done sending task result for task 0afff68d-5257-a05c-c957-000000000035 11701 1727096129.94274: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 11701 1727096129.94340: no more pending results, returning what we have 11701 1727096129.94344: results queue empty 11701 1727096129.94345: checking for any_errors_fatal 11701 1727096129.94352: done checking for any_errors_fatal 11701 1727096129.94353: checking for max_fail_percentage 11701 1727096129.94355: done checking for max_fail_percentage 11701 1727096129.94355: checking to see if all hosts have failed and the running result is not ok 11701 1727096129.94357: done checking to see if all hosts have failed 11701 1727096129.94357: getting the remaining hosts for this loop 11701 1727096129.94358: done getting the remaining hosts for this loop 11701 1727096129.94362: getting the next task for host managed_node3 11701 1727096129.94370: done getting next task for host managed_node3 11701 1727096129.94374: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11701 1727096129.94377: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11701 1727096129.94393: getting variables 11701 1727096129.94395: in VariableManager get_vars() 11701 1727096129.94434: Calling all_inventory to load vars for managed_node3 11701 1727096129.94436: Calling groups_inventory to load vars for managed_node3 11701 1727096129.94438: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096129.94447: Calling all_plugins_play to load vars for managed_node3 11701 1727096129.94452: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096129.94455: Calling groups_plugins_play to load vars for managed_node3 11701 1727096129.95824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096129.96695: done with get_vars() 11701 1727096129.96718: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 08:55:29 -0400 (0:00:00.041) 0:00:13.932 ****** 11701 1727096129.96787: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 11701 1727096129.96788: Creating lock for fedora.linux_system_roles.network_connections 11701 1727096129.97053: worker is 1 (out of 1 available) 11701 1727096129.97068: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 11701 1727096129.97079: done queuing things up, now waiting for results queue to drain 11701 1727096129.97080: waiting for pending results... 
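
The next task, "Configure networking connection profiles", is the one that actually invokes the fedora.linux_system_roles.network_connections module. From the module_args dumped further down in this log, the role input can be reconstructed; the sketch below shows the resolved literal values, whereas the test play builds them from the controller_profile, controller_device, portN_profile and dhcp_interfaceN vars seen in the variable traces.

    # Sketch of the network_connections input, reconstructed from the
    # module_args printed later in this log; resolved values shown literally.
    network_connections:
      - name: bond0                # controller_profile
        state: up
        type: bond
        interface_name: nm-bond    # controller_device
        bond:
          mode: active-backup
          miimon: 110
        ip:
          route_metric4: 65535
      - name: bond0.0              # port1_profile
        state: up
        type: ethernet
        interface_name: test1      # dhcp_interface1
        controller: bond0
      - name: bond0.1              # port2_profile
        state: up
        type: ethernet
        interface_name: test2      # dhcp_interface2
        controller: bond0
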
11701 1727096129.97260: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11701 1727096129.97357: in run() - task 0afff68d-5257-a05c-c957-000000000036 11701 1727096129.97372: variable 'ansible_search_path' from source: unknown 11701 1727096129.97375: variable 'ansible_search_path' from source: unknown 11701 1727096129.97418: calling self._execute() 11701 1727096129.97510: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096129.97514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096129.97520: variable 'omit' from source: magic vars 11701 1727096129.97899: variable 'ansible_distribution_major_version' from source: facts 11701 1727096129.97916: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096129.97921: variable 'omit' from source: magic vars 11701 1727096129.98007: variable 'omit' from source: magic vars 11701 1727096129.98130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11701 1727096130.00006: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11701 1727096130.00049: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11701 1727096130.00080: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11701 1727096130.00108: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11701 1727096130.00128: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11701 1727096130.00192: variable 'network_provider' from source: set_fact 11701 1727096130.00295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096130.00321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096130.00337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096130.00365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096130.00378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096130.00433: variable 'omit' from source: magic vars 11701 1727096130.00516: variable 'omit' from source: magic vars 11701 1727096130.00591: variable 'network_connections' from source: task vars 11701 1727096130.00600: variable 'controller_profile' from source: play vars 11701 1727096130.00646: variable 'controller_profile' from source: play vars 11701 1727096130.00651: variable 'controller_device' from source: play vars 11701 1727096130.00697: variable 'controller_device' from source: play vars 11701 1727096130.00705: variable 'port1_profile' 
from source: play vars 11701 1727096130.00748: variable 'port1_profile' from source: play vars 11701 1727096130.00751: variable 'dhcp_interface1' from source: play vars 11701 1727096130.00799: variable 'dhcp_interface1' from source: play vars 11701 1727096130.00804: variable 'controller_profile' from source: play vars 11701 1727096130.00845: variable 'controller_profile' from source: play vars 11701 1727096130.00850: variable 'port2_profile' from source: play vars 11701 1727096130.00913: variable 'port2_profile' from source: play vars 11701 1727096130.00919: variable 'dhcp_interface2' from source: play vars 11701 1727096130.00962: variable 'dhcp_interface2' from source: play vars 11701 1727096130.00972: variable 'controller_profile' from source: play vars 11701 1727096130.01014: variable 'controller_profile' from source: play vars 11701 1727096130.01139: variable 'omit' from source: magic vars 11701 1727096130.01145: variable '__lsr_ansible_managed' from source: task vars 11701 1727096130.01193: variable '__lsr_ansible_managed' from source: task vars 11701 1727096130.01325: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11701 1727096130.01472: Loaded config def from plugin (lookup/template) 11701 1727096130.01475: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 11701 1727096130.01504: File lookup term: get_ansible_managed.j2 11701 1727096130.01507: variable 'ansible_search_path' from source: unknown 11701 1727096130.01512: evaluation_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 11701 1727096130.01525: search_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 11701 1727096130.01539: variable 'ansible_search_path' from source: unknown 11701 1727096130.04771: variable 'ansible_managed' from source: unknown 11701 1727096130.04850: variable 'omit' from source: magic vars 11701 1727096130.04875: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096130.04899: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096130.04914: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096130.04926: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096130.04934: Loading ShellModule 'sh' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096130.04959: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096130.04962: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096130.04964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096130.05035: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096130.05038: Set connection var ansible_timeout to 10 11701 1727096130.05041: Set connection var ansible_shell_type to sh 11701 1727096130.05046: Set connection var ansible_shell_executable to /bin/sh 11701 1727096130.05058: Set connection var ansible_connection to ssh 11701 1727096130.05060: Set connection var ansible_pipelining to False 11701 1727096130.05079: variable 'ansible_shell_executable' from source: unknown 11701 1727096130.05082: variable 'ansible_connection' from source: unknown 11701 1727096130.05084: variable 'ansible_module_compression' from source: unknown 11701 1727096130.05086: variable 'ansible_shell_type' from source: unknown 11701 1727096130.05088: variable 'ansible_shell_executable' from source: unknown 11701 1727096130.05090: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096130.05094: variable 'ansible_pipelining' from source: unknown 11701 1727096130.05097: variable 'ansible_timeout' from source: unknown 11701 1727096130.05101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096130.05194: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11701 1727096130.05202: variable 'omit' from source: magic vars 11701 1727096130.05209: starting attempt loop 11701 1727096130.05213: running the handler 11701 1727096130.05225: _low_level_execute_command(): starting 11701 1727096130.05231: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096130.05739: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096130.05743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096130.05745: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096130.05747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096130.05804: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096130.05807: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096130.05809: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096130.05857: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096130.07565: stdout chunk (state=3): >>>/root <<< 11701 1727096130.07653: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096130.07687: stderr chunk (state=3): >>><<< 11701 1727096130.07692: stdout chunk (state=3): >>><<< 11701 1727096130.07717: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096130.07726: _low_level_execute_command(): starting 11701 1727096130.07732: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096130.077165-12356-7321814993411 `" && echo ansible-tmp-1727096130.077165-12356-7321814993411="` echo /root/.ansible/tmp/ansible-tmp-1727096130.077165-12356-7321814993411 `" ) && sleep 0' 11701 1727096130.08172: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096130.08205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096130.08208: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 11701 1727096130.08210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096130.08213: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096130.08215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096130.08217: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096130.08277: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/e9699315b0' <<< 11701 1727096130.08281: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096130.08283: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096130.08325: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096130.10271: stdout chunk (state=3): >>>ansible-tmp-1727096130.077165-12356-7321814993411=/root/.ansible/tmp/ansible-tmp-1727096130.077165-12356-7321814993411 <<< 11701 1727096130.10367: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096130.10398: stderr chunk (state=3): >>><<< 11701 1727096130.10401: stdout chunk (state=3): >>><<< 11701 1727096130.10421: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096130.077165-12356-7321814993411=/root/.ansible/tmp/ansible-tmp-1727096130.077165-12356-7321814993411 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096130.10466: variable 'ansible_module_compression' from source: unknown 11701 1727096130.10505: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 11701 1727096130.10509: ANSIBALLZ: Acquiring lock 11701 1727096130.10511: ANSIBALLZ: Lock acquired: 139907404610400 11701 1727096130.10514: ANSIBALLZ: Creating module 11701 1727096130.23886: ANSIBALLZ: Writing module into payload 11701 1727096130.24113: ANSIBALLZ: Writing module 11701 1727096130.24132: ANSIBALLZ: Renaming module 11701 1727096130.24138: ANSIBALLZ: Done creating module 11701 1727096130.24162: variable 'ansible_facts' from source: unknown 11701 1727096130.24230: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096130.077165-12356-7321814993411/AnsiballZ_network_connections.py 11701 1727096130.24337: Sending initial data 11701 1727096130.24340: Sent initial data (165 bytes) 11701 1727096130.24806: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096130.24810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 11701 1727096130.24812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 11701 1727096130.24814: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096130.24816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096130.24874: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096130.24877: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096130.24880: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096130.24923: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096130.26635: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096130.26662: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11701 1727096130.26728: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpj_zmvwmh /root/.ansible/tmp/ansible-tmp-1727096130.077165-12356-7321814993411/AnsiballZ_network_connections.py <<< 11701 1727096130.26732: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096130.077165-12356-7321814993411/AnsiballZ_network_connections.py" <<< 11701 1727096130.26770: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpj_zmvwmh" to remote "/root/.ansible/tmp/ansible-tmp-1727096130.077165-12356-7321814993411/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096130.077165-12356-7321814993411/AnsiballZ_network_connections.py" <<< 11701 1727096130.27910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096130.28085: stderr chunk (state=3): >>><<< 11701 1727096130.28089: stdout chunk (state=3): >>><<< 11701 1727096130.28091: done transferring module to remote 11701 1727096130.28093: _low_level_execute_command(): starting 11701 1727096130.28095: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096130.077165-12356-7321814993411/ /root/.ansible/tmp/ansible-tmp-1727096130.077165-12356-7321814993411/AnsiballZ_network_connections.py && sleep 0' 11701 1727096130.28832: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096130.28835: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096130.28837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096130.28839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096130.28884: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096130.28907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096130.28920: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096130.28941: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096130.29004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096130.30857: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096130.30879: stderr chunk (state=3): >>><<< 11701 1727096130.30882: stdout chunk (state=3): >>><<< 11701 1727096130.30897: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096130.30900: _low_level_execute_command(): starting 11701 1727096130.30906: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096130.077165-12356-7321814993411/AnsiballZ_network_connections.py && sleep 0' 11701 1727096130.31344: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096130.31348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096130.31354: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 11701 1727096130.31357: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096130.31360: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096130.31409: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096130.31412: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096130.31416: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096130.31457: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096130.76978: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 857d53ff-7175-4f1a-9313-51a779b02f5c\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 1356eebb-22d1-4dd0-adba-d2a9505d1fb4\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, cf85fda0-9a30-4802-92fc-47e5937048b2\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 857d53ff-7175-4f1a-9313-51a779b02f5c (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 
1356eebb-22d1-4dd0-adba-d2a9505d1fb4 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, cf85fda0-9a30-4802-92fc-47e5937048b2 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 11701 1727096130.79084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 11701 1727096130.79108: stderr chunk (state=3): >>><<< 11701 1727096130.79111: stdout chunk (state=3): >>><<< 11701 1727096130.79129: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 857d53ff-7175-4f1a-9313-51a779b02f5c\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 1356eebb-22d1-4dd0-adba-d2a9505d1fb4\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, cf85fda0-9a30-4802-92fc-47e5937048b2\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 857d53ff-7175-4f1a-9313-51a779b02f5c (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 1356eebb-22d1-4dd0-adba-d2a9505d1fb4 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, cf85fda0-9a30-4802-92fc-47e5937048b2 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": 
false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 11701 1727096130.79179: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096130.077165-12356-7321814993411/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096130.79188: _low_level_execute_command(): starting 11701 1727096130.79190: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096130.077165-12356-7321814993411/ > /dev/null 2>&1 && sleep 0' 11701 1727096130.79649: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096130.79655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096130.79658: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 11701 1727096130.79660: stderr chunk (state=3): >>>debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096130.79662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096130.79714: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096130.79717: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096130.79719: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096130.79760: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096130.81857: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096130.81886: stderr chunk (state=3): >>><<< 11701 1727096130.81890: stdout chunk (state=3): >>><<< 11701 1727096130.81904: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096130.81910: handler run complete 11701 1727096130.81938: attempt loop complete, returning result 11701 1727096130.81941: _execute() done 11701 1727096130.81947: dumping result to json 11701 1727096130.81955: done dumping result, returning 11701 1727096130.81962: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-a05c-c957-000000000036] 11701 1727096130.81966: sending task result for task 0afff68d-5257-a05c-c957-000000000036 11701 1727096130.82087: done sending task result for task 0afff68d-5257-a05c-c957-000000000036 11701 1727096130.82089: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": 
"ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 857d53ff-7175-4f1a-9313-51a779b02f5c [008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 1356eebb-22d1-4dd0-adba-d2a9505d1fb4 [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, cf85fda0-9a30-4802-92fc-47e5937048b2 [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 857d53ff-7175-4f1a-9313-51a779b02f5c (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 1356eebb-22d1-4dd0-adba-d2a9505d1fb4 (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, cf85fda0-9a30-4802-92fc-47e5937048b2 (not-active) 11701 1727096130.82212: no more pending results, returning what we have 11701 1727096130.82215: results queue empty 11701 1727096130.82216: checking for any_errors_fatal 11701 1727096130.82221: done checking for any_errors_fatal 11701 1727096130.82222: checking for max_fail_percentage 11701 1727096130.82223: done checking for max_fail_percentage 11701 1727096130.82224: checking to see if all hosts have failed and the running result is not ok 11701 1727096130.82225: done checking to see if all hosts have failed 11701 1727096130.82226: getting the remaining hosts for this loop 11701 1727096130.82228: done getting the remaining hosts for this loop 11701 1727096130.82231: getting the next task for host managed_node3 11701 1727096130.82237: done getting next task for host managed_node3 11701 1727096130.82242: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 11701 1727096130.82244: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096130.82256: getting variables 11701 1727096130.82258: in VariableManager get_vars() 11701 1727096130.82304: Calling all_inventory to load vars for managed_node3 11701 1727096130.82307: Calling groups_inventory to load vars for managed_node3 11701 1727096130.82309: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096130.82318: Calling all_plugins_play to load vars for managed_node3 11701 1727096130.82321: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096130.82323: Calling groups_plugins_play to load vars for managed_node3 11701 1727096130.83278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096130.84146: done with get_vars() 11701 1727096130.84172: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 08:55:30 -0400 (0:00:00.874) 0:00:14.807 ****** 11701 1727096130.84241: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 11701 1727096130.84242: Creating lock for fedora.linux_system_roles.network_state 11701 1727096130.84514: worker is 1 (out of 1 available) 11701 1727096130.84528: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 11701 1727096130.84539: done queuing things up, now waiting for results queue to drain 11701 1727096130.84541: waiting for pending results... 11701 1727096130.84723: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 11701 1727096130.84807: in run() - task 0afff68d-5257-a05c-c957-000000000037 11701 1727096130.84819: variable 'ansible_search_path' from source: unknown 11701 1727096130.84822: variable 'ansible_search_path' from source: unknown 11701 1727096130.84853: calling self._execute() 11701 1727096130.84924: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096130.84928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096130.84938: variable 'omit' from source: magic vars 11701 1727096130.85220: variable 'ansible_distribution_major_version' from source: facts 11701 1727096130.85230: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096130.85319: variable 'network_state' from source: role '' defaults 11701 1727096130.85326: Evaluated conditional (network_state != {}): False 11701 1727096130.85329: when evaluation is False, skipping this task 11701 1727096130.85332: _execute() done 11701 1727096130.85335: dumping result to json 11701 1727096130.85337: done dumping result, returning 11701 1727096130.85344: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-a05c-c957-000000000037] 11701 1727096130.85347: sending task result for task 0afff68d-5257-a05c-c957-000000000037 11701 1727096130.85433: done sending task result for task 0afff68d-5257-a05c-c957-000000000037 11701 1727096130.85436: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11701 1727096130.85491: no more pending results, returning what we have 11701 1727096130.85495: results queue empty 11701 1727096130.85496: checking for any_errors_fatal 11701 1727096130.85509: done checking for 
any_errors_fatal 11701 1727096130.85509: checking for max_fail_percentage 11701 1727096130.85511: done checking for max_fail_percentage 11701 1727096130.85512: checking to see if all hosts have failed and the running result is not ok 11701 1727096130.85513: done checking to see if all hosts have failed 11701 1727096130.85514: getting the remaining hosts for this loop 11701 1727096130.85515: done getting the remaining hosts for this loop 11701 1727096130.85519: getting the next task for host managed_node3 11701 1727096130.85525: done getting next task for host managed_node3 11701 1727096130.85529: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11701 1727096130.85532: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11701 1727096130.85549: getting variables 11701 1727096130.85552: in VariableManager get_vars() 11701 1727096130.85595: Calling all_inventory to load vars for managed_node3 11701 1727096130.85598: Calling groups_inventory to load vars for managed_node3 11701 1727096130.85600: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096130.85609: Calling all_plugins_play to load vars for managed_node3 11701 1727096130.85612: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096130.85614: Calling groups_plugins_play to load vars for managed_node3 11701 1727096130.86403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096130.87273: done with get_vars() 11701 1727096130.87297: done getting variables 11701 1727096130.87343: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 08:55:30 -0400 (0:00:00.031) 0:00:14.838 ****** 11701 1727096130.87373: entering _queue_task() for managed_node3/debug 11701 1727096130.87637: worker is 1 (out of 1 available) 11701 1727096130.87654: exiting _queue_task() for managed_node3/debug 11701 1727096130.87664: done queuing things up, now waiting for results queue to drain 11701 1727096130.87665: waiting for pending results... 
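
The "Show stderr messages for the network_connections" task queued here is a plain debug of the result registered by the profile-configuration step. A minimal sketch of that task, assuming the result was registered as __network_connections_result as the variable trace below indicates:

    # Minimal sketch; the real task lives at roles/network/tasks/main.yml:177.
    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines
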
11701 1727096130.87843: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11701 1727096130.87930: in run() - task 0afff68d-5257-a05c-c957-000000000038 11701 1727096130.87942: variable 'ansible_search_path' from source: unknown 11701 1727096130.87946: variable 'ansible_search_path' from source: unknown 11701 1727096130.87979: calling self._execute() 11701 1727096130.88050: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096130.88058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096130.88066: variable 'omit' from source: magic vars 11701 1727096130.88341: variable 'ansible_distribution_major_version' from source: facts 11701 1727096130.88351: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096130.88360: variable 'omit' from source: magic vars 11701 1727096130.88399: variable 'omit' from source: magic vars 11701 1727096130.88424: variable 'omit' from source: magic vars 11701 1727096130.88462: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096130.88490: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096130.88506: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096130.88519: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096130.88529: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096130.88558: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096130.88561: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096130.88564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096130.88634: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096130.88637: Set connection var ansible_timeout to 10 11701 1727096130.88640: Set connection var ansible_shell_type to sh 11701 1727096130.88645: Set connection var ansible_shell_executable to /bin/sh 11701 1727096130.88648: Set connection var ansible_connection to ssh 11701 1727096130.88661: Set connection var ansible_pipelining to False 11701 1727096130.88680: variable 'ansible_shell_executable' from source: unknown 11701 1727096130.88682: variable 'ansible_connection' from source: unknown 11701 1727096130.88685: variable 'ansible_module_compression' from source: unknown 11701 1727096130.88688: variable 'ansible_shell_type' from source: unknown 11701 1727096130.88690: variable 'ansible_shell_executable' from source: unknown 11701 1727096130.88692: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096130.88695: variable 'ansible_pipelining' from source: unknown 11701 1727096130.88697: variable 'ansible_timeout' from source: unknown 11701 1727096130.88701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096130.88808: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 
1727096130.88817: variable 'omit' from source: magic vars 11701 1727096130.88822: starting attempt loop 11701 1727096130.88825: running the handler 11701 1727096130.88928: variable '__network_connections_result' from source: set_fact 11701 1727096130.88979: handler run complete 11701 1727096130.88994: attempt loop complete, returning result 11701 1727096130.88997: _execute() done 11701 1727096130.88999: dumping result to json 11701 1727096130.89002: done dumping result, returning 11701 1727096130.89011: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-a05c-c957-000000000038] 11701 1727096130.89013: sending task result for task 0afff68d-5257-a05c-c957-000000000038 11701 1727096130.89097: done sending task result for task 0afff68d-5257-a05c-c957-000000000038 11701 1727096130.89100: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 857d53ff-7175-4f1a-9313-51a779b02f5c", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 1356eebb-22d1-4dd0-adba-d2a9505d1fb4", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, cf85fda0-9a30-4802-92fc-47e5937048b2", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 857d53ff-7175-4f1a-9313-51a779b02f5c (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 1356eebb-22d1-4dd0-adba-d2a9505d1fb4 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, cf85fda0-9a30-4802-92fc-47e5937048b2 (not-active)" ] } 11701 1727096130.89159: no more pending results, returning what we have 11701 1727096130.89162: results queue empty 11701 1727096130.89163: checking for any_errors_fatal 11701 1727096130.89169: done checking for any_errors_fatal 11701 1727096130.89170: checking for max_fail_percentage 11701 1727096130.89172: done checking for max_fail_percentage 11701 1727096130.89173: checking to see if all hosts have failed and the running result is not ok 11701 1727096130.89174: done checking to see if all hosts have failed 11701 1727096130.89174: getting the remaining hosts for this loop 11701 1727096130.89176: done getting the remaining hosts for this loop 11701 1727096130.89179: getting the next task for host managed_node3 11701 1727096130.89186: done getting next task for host managed_node3 11701 1727096130.89189: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11701 1727096130.89192: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096130.89202: getting variables 11701 1727096130.89204: in VariableManager get_vars() 11701 1727096130.89244: Calling all_inventory to load vars for managed_node3 11701 1727096130.89246: Calling groups_inventory to load vars for managed_node3 11701 1727096130.89248: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096130.89257: Calling all_plugins_play to load vars for managed_node3 11701 1727096130.89260: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096130.89263: Calling groups_plugins_play to load vars for managed_node3 11701 1727096130.90157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096130.91020: done with get_vars() 11701 1727096130.91045: done getting variables 11701 1727096130.91094: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 08:55:30 -0400 (0:00:00.037) 0:00:14.875 ****** 11701 1727096130.91128: entering _queue_task() for managed_node3/debug 11701 1727096130.91391: worker is 1 (out of 1 available) 11701 1727096130.91407: exiting _queue_task() for managed_node3/debug 11701 1727096130.91418: done queuing things up, now waiting for results queue to drain 11701 1727096130.91420: waiting for pending results... 11701 1727096130.91606: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11701 1727096130.91694: in run() - task 0afff68d-5257-a05c-c957-000000000039 11701 1727096130.91707: variable 'ansible_search_path' from source: unknown 11701 1727096130.91710: variable 'ansible_search_path' from source: unknown 11701 1727096130.91739: calling self._execute() 11701 1727096130.91813: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096130.91818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096130.91826: variable 'omit' from source: magic vars 11701 1727096130.92113: variable 'ansible_distribution_major_version' from source: facts 11701 1727096130.92123: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096130.92129: variable 'omit' from source: magic vars 11701 1727096130.92172: variable 'omit' from source: magic vars 11701 1727096130.92202: variable 'omit' from source: magic vars 11701 1727096130.92235: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096130.92265: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096130.92283: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096130.92299: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096130.92310: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096130.92332: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096130.92335: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096130.92337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096130.92413: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096130.92416: Set connection var ansible_timeout to 10 11701 1727096130.92419: Set connection var ansible_shell_type to sh 11701 1727096130.92428: Set connection var ansible_shell_executable to /bin/sh 11701 1727096130.92430: Set connection var ansible_connection to ssh 11701 1727096130.92436: Set connection var ansible_pipelining to False 11701 1727096130.92452: variable 'ansible_shell_executable' from source: unknown 11701 1727096130.92458: variable 'ansible_connection' from source: unknown 11701 1727096130.92461: variable 'ansible_module_compression' from source: unknown 11701 1727096130.92463: variable 'ansible_shell_type' from source: unknown 11701 1727096130.92466: variable 'ansible_shell_executable' from source: unknown 11701 1727096130.92469: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096130.92473: variable 'ansible_pipelining' from source: unknown 11701 1727096130.92475: variable 'ansible_timeout' from source: unknown 11701 1727096130.92480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096130.92588: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096130.92596: variable 'omit' from source: magic vars 11701 1727096130.92601: starting attempt loop 11701 1727096130.92605: running the handler 11701 1727096130.92645: variable '__network_connections_result' from source: set_fact 11701 1727096130.92708: variable '__network_connections_result' from source: set_fact 11701 1727096130.92823: handler run complete 11701 1727096130.92844: attempt loop complete, returning result 11701 1727096130.92847: _execute() done 11701 1727096130.92850: dumping result to json 11701 1727096130.92862: done dumping result, returning 11701 1727096130.92866: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-a05c-c957-000000000039] 11701 1727096130.92872: sending task result for task 0afff68d-5257-a05c-c957-000000000039 11701 1727096130.92970: done sending task result for task 0afff68d-5257-a05c-c957-000000000039 11701 1727096130.92974: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 
857d53ff-7175-4f1a-9313-51a779b02f5c\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 1356eebb-22d1-4dd0-adba-d2a9505d1fb4\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, cf85fda0-9a30-4802-92fc-47e5937048b2\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 857d53ff-7175-4f1a-9313-51a779b02f5c (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 1356eebb-22d1-4dd0-adba-d2a9505d1fb4 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, cf85fda0-9a30-4802-92fc-47e5937048b2 (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 857d53ff-7175-4f1a-9313-51a779b02f5c", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 1356eebb-22d1-4dd0-adba-d2a9505d1fb4", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, cf85fda0-9a30-4802-92fc-47e5937048b2", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 857d53ff-7175-4f1a-9313-51a779b02f5c (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 1356eebb-22d1-4dd0-adba-d2a9505d1fb4 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, cf85fda0-9a30-4802-92fc-47e5937048b2 (not-active)" ] } } 11701 1727096130.93064: no more pending results, returning what we have 11701 1727096130.93069: results queue empty 11701 1727096130.93077: checking for any_errors_fatal 11701 1727096130.93085: done checking for any_errors_fatal 11701 1727096130.93085: checking for max_fail_percentage 11701 1727096130.93087: done checking for max_fail_percentage 11701 1727096130.93088: checking to see if all hosts have failed and the running result is not ok 11701 1727096130.93089: done checking to see if all hosts have failed 11701 1727096130.93090: getting the remaining hosts for this loop 11701 1727096130.93091: done getting the remaining hosts for this loop 11701 1727096130.93095: getting the next task for host managed_node3 11701 1727096130.93101: done getting next task for host managed_node3 11701 1727096130.93105: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11701 1727096130.93107: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096130.93117: getting variables 11701 1727096130.93118: in VariableManager get_vars() 11701 1727096130.93154: Calling all_inventory to load vars for managed_node3 11701 1727096130.93156: Calling groups_inventory to load vars for managed_node3 11701 1727096130.93158: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096130.93166: Calling all_plugins_play to load vars for managed_node3 11701 1727096130.93176: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096130.93179: Calling groups_plugins_play to load vars for managed_node3 11701 1727096130.93956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096130.94929: done with get_vars() 11701 1727096130.94948: done getting variables 11701 1727096130.94996: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 08:55:30 -0400 (0:00:00.038) 0:00:14.914 ****** 11701 1727096130.95021: entering _queue_task() for managed_node3/debug 11701 1727096130.95284: worker is 1 (out of 1 available) 11701 1727096130.95299: exiting _queue_task() for managed_node3/debug 11701 1727096130.95310: done queuing things up, now waiting for results queue to drain 11701 1727096130.95312: waiting for pending results... 11701 1727096130.95496: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11701 1727096130.95585: in run() - task 0afff68d-5257-a05c-c957-00000000003a 11701 1727096130.95597: variable 'ansible_search_path' from source: unknown 11701 1727096130.95600: variable 'ansible_search_path' from source: unknown 11701 1727096130.95629: calling self._execute() 11701 1727096130.95703: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096130.95707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096130.95715: variable 'omit' from source: magic vars 11701 1727096130.95996: variable 'ansible_distribution_major_version' from source: facts 11701 1727096130.96006: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096130.96095: variable 'network_state' from source: role '' defaults 11701 1727096130.96103: Evaluated conditional (network_state != {}): False 11701 1727096130.96106: when evaluation is False, skipping this task 11701 1727096130.96109: _execute() done 11701 1727096130.96111: dumping result to json 11701 1727096130.96115: done dumping result, returning 11701 1727096130.96123: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-a05c-c957-00000000003a] 11701 1727096130.96128: sending task result for task 0afff68d-5257-a05c-c957-00000000003a 11701 1727096130.96213: done sending task result for task 0afff68d-5257-a05c-c957-00000000003a 11701 1727096130.96216: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 11701 1727096130.96260: no more pending results, returning what we 
have 11701 1727096130.96264: results queue empty 11701 1727096130.96265: checking for any_errors_fatal 11701 1727096130.96275: done checking for any_errors_fatal 11701 1727096130.96276: checking for max_fail_percentage 11701 1727096130.96277: done checking for max_fail_percentage 11701 1727096130.96278: checking to see if all hosts have failed and the running result is not ok 11701 1727096130.96279: done checking to see if all hosts have failed 11701 1727096130.96280: getting the remaining hosts for this loop 11701 1727096130.96281: done getting the remaining hosts for this loop 11701 1727096130.96284: getting the next task for host managed_node3 11701 1727096130.96291: done getting next task for host managed_node3 11701 1727096130.96295: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 11701 1727096130.96298: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11701 1727096130.96313: getting variables 11701 1727096130.96315: in VariableManager get_vars() 11701 1727096130.96356: Calling all_inventory to load vars for managed_node3 11701 1727096130.96359: Calling groups_inventory to load vars for managed_node3 11701 1727096130.96361: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096130.96374: Calling all_plugins_play to load vars for managed_node3 11701 1727096130.96377: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096130.96379: Calling groups_plugins_play to load vars for managed_node3 11701 1727096130.97170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096130.98031: done with get_vars() 11701 1727096130.98055: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 08:55:30 -0400 (0:00:00.031) 0:00:14.945 ****** 11701 1727096130.98131: entering _queue_task() for managed_node3/ping 11701 1727096130.98132: Creating lock for ping 11701 1727096130.98402: worker is 1 (out of 1 available) 11701 1727096130.98417: exiting _queue_task() for managed_node3/ping 11701 1727096130.98428: done queuing things up, now waiting for results queue to drain 11701 1727096130.98429: waiting for pending results... 
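(The role invocation visible in the module_args above could be driven by play vars along these lines. This is a minimal sketch reconstructed from the logged arguments only: the connection names, the interfaces nm-bond/test1/test2, the active-backup bond with miimon 110, and route_metric4 65535 all come from the result printed above, while the overall playbook shape is illustrative rather than the actual test playbook used in this run:

    # sketch of a playbook that would produce the logged network_connections invocation
    - hosts: managed_node3
      vars:
        controller_device: nm-bond        # appears later in the trace as a play var
        network_connections:
          - name: bond0
            type: bond
            interface_name: nm-bond
            state: up
            bond:
              mode: active-backup
              miimon: 110
            ip:
              route_metric4: 65535
          - name: bond0.0
            type: ethernet
            interface_name: test1
            controller: bond0
            state: up
          - name: bond0.1
            type: ethernet
            interface_name: test2
            controller: bond0
            state: up
      roles:
        - fedora.linux_system_roles.network

The "Re-test connectivity" task queued next is the role's final ping of the managed host before the trace reaches meta (role_complete) and control returns to the calling playbook.)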
11701 1727096130.98616: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 11701 1727096130.98704: in run() - task 0afff68d-5257-a05c-c957-00000000003b 11701 1727096130.98716: variable 'ansible_search_path' from source: unknown 11701 1727096130.98720: variable 'ansible_search_path' from source: unknown 11701 1727096130.98750: calling self._execute() 11701 1727096130.98824: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096130.98828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096130.98837: variable 'omit' from source: magic vars 11701 1727096130.99130: variable 'ansible_distribution_major_version' from source: facts 11701 1727096130.99140: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096130.99146: variable 'omit' from source: magic vars 11701 1727096130.99191: variable 'omit' from source: magic vars 11701 1727096130.99220: variable 'omit' from source: magic vars 11701 1727096130.99256: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096130.99286: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096130.99304: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096130.99319: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096130.99330: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096130.99354: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096130.99357: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096130.99362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096130.99437: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096130.99440: Set connection var ansible_timeout to 10 11701 1727096130.99443: Set connection var ansible_shell_type to sh 11701 1727096130.99448: Set connection var ansible_shell_executable to /bin/sh 11701 1727096130.99451: Set connection var ansible_connection to ssh 11701 1727096130.99460: Set connection var ansible_pipelining to False 11701 1727096130.99479: variable 'ansible_shell_executable' from source: unknown 11701 1727096130.99481: variable 'ansible_connection' from source: unknown 11701 1727096130.99484: variable 'ansible_module_compression' from source: unknown 11701 1727096130.99486: variable 'ansible_shell_type' from source: unknown 11701 1727096130.99489: variable 'ansible_shell_executable' from source: unknown 11701 1727096130.99491: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096130.99495: variable 'ansible_pipelining' from source: unknown 11701 1727096130.99497: variable 'ansible_timeout' from source: unknown 11701 1727096130.99501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096130.99651: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11701 1727096130.99663: variable 'omit' from source: magic vars 11701 
1727096130.99669: starting attempt loop 11701 1727096130.99673: running the handler 11701 1727096130.99684: _low_level_execute_command(): starting 11701 1727096130.99691: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096131.00217: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096131.00222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096131.00227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096131.00275: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096131.00278: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096131.00289: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096131.00332: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096131.02131: stdout chunk (state=3): >>>/root <<< 11701 1727096131.02246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096131.02282: stderr chunk (state=3): >>><<< 11701 1727096131.02286: stdout chunk (state=3): >>><<< 11701 1727096131.02306: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096131.02319: _low_level_execute_command(): starting 11701 1727096131.02325: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727096131.0230691-12387-76108933692235 `" && echo ansible-tmp-1727096131.0230691-12387-76108933692235="` echo /root/.ansible/tmp/ansible-tmp-1727096131.0230691-12387-76108933692235 `" ) && sleep 0' 11701 1727096131.02799: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096131.02803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096131.02805: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 11701 1727096131.02816: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096131.02818: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096131.02864: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096131.02871: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096131.02873: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096131.02915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096131.04942: stdout chunk (state=3): >>>ansible-tmp-1727096131.0230691-12387-76108933692235=/root/.ansible/tmp/ansible-tmp-1727096131.0230691-12387-76108933692235 <<< 11701 1727096131.05040: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096131.05073: stderr chunk (state=3): >>><<< 11701 1727096131.05076: stdout chunk (state=3): >>><<< 11701 1727096131.05098: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096131.0230691-12387-76108933692235=/root/.ansible/tmp/ansible-tmp-1727096131.0230691-12387-76108933692235 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 11701 1727096131.05139: variable 'ansible_module_compression' from source: unknown 11701 1727096131.05180: ANSIBALLZ: Using lock for ping 11701 1727096131.05183: ANSIBALLZ: Acquiring lock 11701 1727096131.05185: ANSIBALLZ: Lock acquired: 139907423291152 11701 1727096131.05188: ANSIBALLZ: Creating module 11701 1727096131.12964: ANSIBALLZ: Writing module into payload 11701 1727096131.13007: ANSIBALLZ: Writing module 11701 1727096131.13027: ANSIBALLZ: Renaming module 11701 1727096131.13033: ANSIBALLZ: Done creating module 11701 1727096131.13051: variable 'ansible_facts' from source: unknown 11701 1727096131.13098: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096131.0230691-12387-76108933692235/AnsiballZ_ping.py 11701 1727096131.13203: Sending initial data 11701 1727096131.13206: Sent initial data (152 bytes) 11701 1727096131.13678: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096131.13682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096131.13684: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096131.13691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096131.13727: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096131.13739: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096131.13790: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096131.15457: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096131.15484: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11701 1727096131.15517: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpyb25ru9s /root/.ansible/tmp/ansible-tmp-1727096131.0230691-12387-76108933692235/AnsiballZ_ping.py <<< 11701 1727096131.15534: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096131.0230691-12387-76108933692235/AnsiballZ_ping.py" <<< 11701 1727096131.15554: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpyb25ru9s" to remote "/root/.ansible/tmp/ansible-tmp-1727096131.0230691-12387-76108933692235/AnsiballZ_ping.py" <<< 11701 1727096131.15557: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096131.0230691-12387-76108933692235/AnsiballZ_ping.py" <<< 11701 1727096131.16036: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096131.16086: stderr chunk (state=3): >>><<< 11701 1727096131.16089: stdout chunk (state=3): >>><<< 11701 1727096131.16128: done transferring module to remote 11701 1727096131.16137: _low_level_execute_command(): starting 11701 1727096131.16142: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096131.0230691-12387-76108933692235/ /root/.ansible/tmp/ansible-tmp-1727096131.0230691-12387-76108933692235/AnsiballZ_ping.py && sleep 0' 11701 1727096131.16611: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096131.16615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096131.16617: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096131.16619: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096131.16625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 11701 1727096131.16628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096131.16676: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096131.16679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096131.16682: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096131.16718: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096131.18541: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096131.18575: stderr chunk (state=3): >>><<< 11701 1727096131.18578: stdout chunk (state=3): >>><<< 11701 1727096131.18590: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096131.18592: _low_level_execute_command(): starting 11701 1727096131.18598: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096131.0230691-12387-76108933692235/AnsiballZ_ping.py && sleep 0' 11701 1727096131.19049: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096131.19055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 11701 1727096131.19057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096131.19059: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096131.19061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096131.19114: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096131.19122: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096131.19129: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096131.19160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096131.34813: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 11701 1727096131.36205: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 11701 1727096131.36209: stdout chunk (state=3): >>><<< 11701 1727096131.36212: stderr chunk (state=3): >>><<< 11701 1727096131.36231: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 11701 1727096131.36260: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096131.0230691-12387-76108933692235/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096131.36313: _low_level_execute_command(): starting 11701 1727096131.36316: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096131.0230691-12387-76108933692235/ > /dev/null 2>&1 && sleep 0' 11701 1727096131.36917: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096131.36932: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096131.36944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096131.36962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096131.36982: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096131.36992: stderr chunk (state=3): >>>debug2: match not found <<< 11701 1727096131.37035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096131.37112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096131.37160: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096131.37195: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096131.39286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096131.39291: stdout chunk (state=3): >>><<< 11701 1727096131.39293: stderr chunk (state=3): >>><<< 11701 1727096131.39296: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096131.39298: handler run complete 11701 1727096131.39300: attempt loop complete, returning result 11701 1727096131.39302: _execute() done 11701 1727096131.39305: dumping result to json 11701 1727096131.39307: done dumping result, returning 11701 1727096131.39309: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-a05c-c957-00000000003b] 11701 1727096131.39311: sending task result for task 0afff68d-5257-a05c-c957-00000000003b 11701 1727096131.39385: done sending task result for task 0afff68d-5257-a05c-c957-00000000003b 11701 1727096131.39389: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 11701 1727096131.39452: no more pending results, returning what we have 11701 1727096131.39456: results queue empty 11701 1727096131.39457: checking for any_errors_fatal 11701 1727096131.39464: done checking for any_errors_fatal 11701 1727096131.39465: checking for max_fail_percentage 11701 1727096131.39468: done checking for max_fail_percentage 11701 1727096131.39469: checking to see if all hosts have failed and the running result is not ok 11701 1727096131.39471: done checking to see if all hosts have failed 11701 1727096131.39472: getting the remaining hosts for this loop 11701 1727096131.39473: done getting the remaining hosts for this loop 11701 1727096131.39477: getting the next task for host managed_node3 11701 1727096131.39487: done getting next task for 
host managed_node3 11701 1727096131.39490: ^ task is: TASK: meta (role_complete) 11701 1727096131.39493: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11701 1727096131.39579: getting variables 11701 1727096131.39582: in VariableManager get_vars() 11701 1727096131.39635: Calling all_inventory to load vars for managed_node3 11701 1727096131.39638: Calling groups_inventory to load vars for managed_node3 11701 1727096131.39641: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096131.39652: Calling all_plugins_play to load vars for managed_node3 11701 1727096131.39656: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096131.39659: Calling groups_plugins_play to load vars for managed_node3 11701 1727096131.41431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096131.43030: done with get_vars() 11701 1727096131.43063: done getting variables 11701 1727096131.43158: done queuing things up, now waiting for results queue to drain 11701 1727096131.43160: results queue empty 11701 1727096131.43161: checking for any_errors_fatal 11701 1727096131.43164: done checking for any_errors_fatal 11701 1727096131.43165: checking for max_fail_percentage 11701 1727096131.43166: done checking for max_fail_percentage 11701 1727096131.43167: checking to see if all hosts have failed and the running result is not ok 11701 1727096131.43169: done checking to see if all hosts have failed 11701 1727096131.43170: getting the remaining hosts for this loop 11701 1727096131.43171: done getting the remaining hosts for this loop 11701 1727096131.43173: getting the next task for host managed_node3 11701 1727096131.43179: done getting next task for host managed_node3 11701 1727096131.43181: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11701 1727096131.43183: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096131.43186: getting variables 11701 1727096131.43187: in VariableManager get_vars() 11701 1727096131.43202: Calling all_inventory to load vars for managed_node3 11701 1727096131.43204: Calling groups_inventory to load vars for managed_node3 11701 1727096131.43206: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096131.43211: Calling all_plugins_play to load vars for managed_node3 11701 1727096131.43213: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096131.43216: Calling groups_plugins_play to load vars for managed_node3 11701 1727096131.44325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096131.45870: done with get_vars() 11701 1727096131.45902: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Monday 23 September 2024 08:55:31 -0400 (0:00:00.478) 0:00:15.424 ****** 11701 1727096131.45989: entering _queue_task() for managed_node3/include_tasks 11701 1727096131.46356: worker is 1 (out of 1 available) 11701 1727096131.46372: exiting _queue_task() for managed_node3/include_tasks 11701 1727096131.46383: done queuing things up, now waiting for results queue to drain 11701 1727096131.46384: waiting for pending results... 11701 1727096131.46776: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 11701 1727096131.46785: in run() - task 0afff68d-5257-a05c-c957-00000000006e 11701 1727096131.46806: variable 'ansible_search_path' from source: unknown 11701 1727096131.46813: variable 'ansible_search_path' from source: unknown 11701 1727096131.46857: calling self._execute() 11701 1727096131.46961: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096131.46978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096131.46991: variable 'omit' from source: magic vars 11701 1727096131.47376: variable 'ansible_distribution_major_version' from source: facts 11701 1727096131.47395: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096131.47517: _execute() done 11701 1727096131.47520: dumping result to json 11701 1727096131.47522: done dumping result, returning 11701 1727096131.47524: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-a05c-c957-00000000006e] 11701 1727096131.47526: sending task result for task 0afff68d-5257-a05c-c957-00000000006e 11701 1727096131.47598: done sending task result for task 0afff68d-5257-a05c-c957-00000000006e 11701 1727096131.47602: WORKER PROCESS EXITING 11701 1727096131.47645: no more pending results, returning what we have 11701 1727096131.47654: in VariableManager get_vars() 11701 1727096131.47706: Calling all_inventory to load vars for managed_node3 11701 1727096131.47709: Calling groups_inventory to load vars for managed_node3 11701 1727096131.47712: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096131.47726: Calling all_plugins_play to load vars for managed_node3 11701 1727096131.47729: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096131.47732: Calling groups_plugins_play to load vars for managed_node3 11701 1727096131.49394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 11701 1727096131.50908: done with get_vars() 11701 1727096131.50939: variable 'ansible_search_path' from source: unknown 11701 1727096131.50940: variable 'ansible_search_path' from source: unknown 11701 1727096131.50987: we have included files to process 11701 1727096131.50988: generating all_blocks data 11701 1727096131.50990: done generating all_blocks data 11701 1727096131.50996: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11701 1727096131.50997: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11701 1727096131.50999: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11701 1727096131.51196: done processing included file 11701 1727096131.51198: iterating over new_blocks loaded from include file 11701 1727096131.51200: in VariableManager get_vars() 11701 1727096131.51221: done with get_vars() 11701 1727096131.51223: filtering new block on tags 11701 1727096131.51241: done filtering new block on tags 11701 1727096131.51243: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 11701 1727096131.51248: extending task lists for all hosts with included blocks 11701 1727096131.51356: done extending task lists 11701 1727096131.51358: done processing included files 11701 1727096131.51359: results queue empty 11701 1727096131.51359: checking for any_errors_fatal 11701 1727096131.51361: done checking for any_errors_fatal 11701 1727096131.51362: checking for max_fail_percentage 11701 1727096131.51363: done checking for max_fail_percentage 11701 1727096131.51363: checking to see if all hosts have failed and the running result is not ok 11701 1727096131.51364: done checking to see if all hosts have failed 11701 1727096131.51365: getting the remaining hosts for this loop 11701 1727096131.51366: done getting the remaining hosts for this loop 11701 1727096131.51370: getting the next task for host managed_node3 11701 1727096131.51375: done getting next task for host managed_node3 11701 1727096131.51377: ^ task is: TASK: Get stat for interface {{ interface }} 11701 1727096131.51380: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096131.51382: getting variables 11701 1727096131.51383: in VariableManager get_vars() 11701 1727096131.51397: Calling all_inventory to load vars for managed_node3 11701 1727096131.51399: Calling groups_inventory to load vars for managed_node3 11701 1727096131.51401: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096131.51407: Calling all_plugins_play to load vars for managed_node3 11701 1727096131.51409: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096131.51412: Calling groups_plugins_play to load vars for managed_node3 11701 1727096131.52604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096131.55194: done with get_vars() 11701 1727096131.55222: done getting variables 11701 1727096131.55530: variable 'interface' from source: task vars 11701 1727096131.55534: variable 'controller_device' from source: play vars 11701 1727096131.55596: variable 'controller_device' from source: play vars TASK [Get stat for interface nm-bond] ****************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Monday 23 September 2024 08:55:31 -0400 (0:00:00.096) 0:00:15.520 ****** 11701 1727096131.55628: entering _queue_task() for managed_node3/stat 11701 1727096131.55987: worker is 1 (out of 1 available) 11701 1727096131.55999: exiting _queue_task() for managed_node3/stat 11701 1727096131.56010: done queuing things up, now waiting for results queue to drain 11701 1727096131.56011: waiting for pending results... 11701 1727096131.56389: running TaskExecutor() for managed_node3/TASK: Get stat for interface nm-bond 11701 1727096131.56492: in run() - task 0afff68d-5257-a05c-c957-000000000241 11701 1727096131.56512: variable 'ansible_search_path' from source: unknown 11701 1727096131.56518: variable 'ansible_search_path' from source: unknown 11701 1727096131.56559: calling self._execute() 11701 1727096131.56660: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096131.56697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096131.56739: variable 'omit' from source: magic vars 11701 1727096131.57128: variable 'ansible_distribution_major_version' from source: facts 11701 1727096131.57154: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096131.57249: variable 'omit' from source: magic vars 11701 1727096131.57255: variable 'omit' from source: magic vars 11701 1727096131.57339: variable 'interface' from source: task vars 11701 1727096131.57347: variable 'controller_device' from source: play vars 11701 1727096131.57420: variable 'controller_device' from source: play vars 11701 1727096131.57446: variable 'omit' from source: magic vars 11701 1727096131.57503: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096131.57543: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096131.57576: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096131.57599: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096131.57614: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 11701 1727096131.57647: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096131.57658: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096131.57665: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096131.57778: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096131.57793: Set connection var ansible_timeout to 10 11701 1727096131.57899: Set connection var ansible_shell_type to sh 11701 1727096131.57902: Set connection var ansible_shell_executable to /bin/sh 11701 1727096131.57904: Set connection var ansible_connection to ssh 11701 1727096131.57907: Set connection var ansible_pipelining to False 11701 1727096131.57909: variable 'ansible_shell_executable' from source: unknown 11701 1727096131.57911: variable 'ansible_connection' from source: unknown 11701 1727096131.57913: variable 'ansible_module_compression' from source: unknown 11701 1727096131.57915: variable 'ansible_shell_type' from source: unknown 11701 1727096131.57917: variable 'ansible_shell_executable' from source: unknown 11701 1727096131.57919: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096131.57921: variable 'ansible_pipelining' from source: unknown 11701 1727096131.57924: variable 'ansible_timeout' from source: unknown 11701 1727096131.57926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096131.58112: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11701 1727096131.58172: variable 'omit' from source: magic vars 11701 1727096131.58175: starting attempt loop 11701 1727096131.58178: running the handler 11701 1727096131.58180: _low_level_execute_command(): starting 11701 1727096131.58182: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096131.58994: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096131.59080: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096131.59233: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096131.59363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096131.61139: stdout chunk (state=3): >>>/root <<< 11701 1727096131.61156: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096131.61193: stderr chunk (state=3): >>><<< 11701 1727096131.61220: stdout chunk (state=3): >>><<< 11701 1727096131.61499: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096131.61503: _low_level_execute_command(): starting 11701 1727096131.61506: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096131.614008-12409-71142601482397 `" && echo ansible-tmp-1727096131.614008-12409-71142601482397="` echo /root/.ansible/tmp/ansible-tmp-1727096131.614008-12409-71142601482397 `" ) && sleep 0' 11701 1727096131.62486: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096131.62987: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096131.64907: stdout chunk (state=3): >>>ansible-tmp-1727096131.614008-12409-71142601482397=/root/.ansible/tmp/ansible-tmp-1727096131.614008-12409-71142601482397 <<< 11701 1727096131.65187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096131.65191: stdout chunk (state=3): >>><<< 11701 1727096131.65194: stderr chunk (state=3): >>><<< 11701 1727096131.65261: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727096131.614008-12409-71142601482397=/root/.ansible/tmp/ansible-tmp-1727096131.614008-12409-71142601482397 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096131.65305: variable 'ansible_module_compression' from source: unknown 11701 1727096131.65499: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11701ub3f79xu/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11701 1727096131.65542: variable 'ansible_facts' from source: unknown 11701 1727096131.65811: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096131.614008-12409-71142601482397/AnsiballZ_stat.py 11701 1727096131.66152: Sending initial data 11701 1727096131.66170: Sent initial data (151 bytes) 11701 1727096131.66870: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096131.67017: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096131.67049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096131.67079: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096131.67098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096131.67199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096131.68876: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports 
extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096131.68919: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11701 1727096131.68948: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmp6ujqehnt /root/.ansible/tmp/ansible-tmp-1727096131.614008-12409-71142601482397/AnsiballZ_stat.py <<< 11701 1727096131.68958: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096131.614008-12409-71142601482397/AnsiballZ_stat.py" <<< 11701 1727096131.68977: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmp6ujqehnt" to remote "/root/.ansible/tmp/ansible-tmp-1727096131.614008-12409-71142601482397/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096131.614008-12409-71142601482397/AnsiballZ_stat.py" <<< 11701 1727096131.69483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096131.69524: stderr chunk (state=3): >>><<< 11701 1727096131.69533: stdout chunk (state=3): >>><<< 11701 1727096131.69590: done transferring module to remote 11701 1727096131.69599: _low_level_execute_command(): starting 11701 1727096131.69608: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096131.614008-12409-71142601482397/ /root/.ansible/tmp/ansible-tmp-1727096131.614008-12409-71142601482397/AnsiballZ_stat.py && sleep 0' 11701 1727096131.70229: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096131.70244: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096131.70261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096131.70287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096131.70308: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096131.70386: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096131.70422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096131.70439: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096131.70465: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096131.70573: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096131.72427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096131.72470: stderr chunk (state=3): >>><<< 11701 1727096131.72473: stdout chunk (state=3): >>><<< 11701 1727096131.72486: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096131.72489: _low_level_execute_command(): starting 11701 1727096131.72494: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096131.614008-12409-71142601482397/AnsiballZ_stat.py && sleep 0' 11701 1727096131.73125: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096131.73182: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096131.73234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096131.89014: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28290, "dev": 23, "nlink": 1, "atime": 
1727096130.6095154, "mtime": 1727096130.6095154, "ctime": 1727096130.6095154, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11701 1727096131.90582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 11701 1727096131.90586: stdout chunk (state=3): >>><<< 11701 1727096131.90589: stderr chunk (state=3): >>><<< 11701 1727096131.90591: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28290, "dev": 23, "nlink": 1, "atime": 1727096130.6095154, "mtime": 1727096130.6095154, "ctime": 1727096130.6095154, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
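The JSON payload above is what AnsiballZ_stat.py printed on the target host for /sys/class/net/nm-bond. Judging from the invocation.module_args echoed in that payload and the task name "Get stat for interface {{ interface }}" at get_interface_stat.yml:3, the included file plausibly reduces to a single stat task along the lines of the sketch below. This is a reconstruction from the log, not the verified file contents; the interface_stat register name is inferred from the assert that runs later in this excerpt.

- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    path: "/sys/class/net/{{ interface }}"  # renders to /sys/class/net/nm-bond in this run
    get_attributes: false
    get_checksum: false
    get_mime: false
    follow: false
  register: interface_stat  # inferred: consumed below as interface_stat.stat.exists

Because follow is false, the symlink itself is stat'ed, which is why the result reports islnk: true with lnk_target ../../devices/virtual/net/nm-bond rather than the resolved device directory.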
11701 1727096131.90593: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096131.614008-12409-71142601482397/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096131.90595: _low_level_execute_command(): starting 11701 1727096131.90598: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096131.614008-12409-71142601482397/ > /dev/null 2>&1 && sleep 0' 11701 1727096131.91278: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096131.91294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096131.91380: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096131.91407: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096131.91431: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096131.91503: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096131.93401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096131.93476: stderr chunk (state=3): >>><<< 11701 1727096131.93485: stdout chunk (state=3): >>><<< 11701 1727096131.93506: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096131.93517: handler run complete 11701 1727096131.93586: attempt loop complete, returning result 11701 1727096131.93594: _execute() done 11701 1727096131.93602: dumping result to json 11701 1727096131.93612: done dumping result, returning 11701 1727096131.93624: done running TaskExecutor() for managed_node3/TASK: Get stat for interface nm-bond [0afff68d-5257-a05c-c957-000000000241] 11701 1727096131.93632: sending task result for task 0afff68d-5257-a05c-c957-000000000241 ok: [managed_node3] => { "changed": false, "stat": { "atime": 1727096130.6095154, "block_size": 4096, "blocks": 0, "ctime": 1727096130.6095154, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 28290, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "mode": "0777", "mtime": 1727096130.6095154, "nlink": 1, "path": "/sys/class/net/nm-bond", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11701 1727096131.94066: no more pending results, returning what we have 11701 1727096131.94072: results queue empty 11701 1727096131.94073: checking for any_errors_fatal 11701 1727096131.94075: done checking for any_errors_fatal 11701 1727096131.94075: checking for max_fail_percentage 11701 1727096131.94077: done checking for max_fail_percentage 11701 1727096131.94078: checking to see if all hosts have failed and the running result is not ok 11701 1727096131.94079: done checking to see if all hosts have failed 11701 1727096131.94079: getting the remaining hosts for this loop 11701 1727096131.94081: done getting the remaining hosts for this loop 11701 1727096131.94084: getting the next task for host managed_node3 11701 1727096131.94093: done getting next task for host managed_node3 11701 1727096131.94096: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11701 1727096131.94099: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096131.94104: getting variables 11701 1727096131.94105: in VariableManager get_vars() 11701 1727096131.94144: Calling all_inventory to load vars for managed_node3 11701 1727096131.94147: Calling groups_inventory to load vars for managed_node3 11701 1727096131.94152: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096131.94163: Calling all_plugins_play to load vars for managed_node3 11701 1727096131.94166: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096131.94287: done sending task result for task 0afff68d-5257-a05c-c957-000000000241 11701 1727096131.94291: WORKER PROCESS EXITING 11701 1727096131.94296: Calling groups_plugins_play to load vars for managed_node3 11701 1727096131.95934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096131.97133: done with get_vars() 11701 1727096131.97158: done getting variables 11701 1727096131.97206: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11701 1727096131.97297: variable 'interface' from source: task vars 11701 1727096131.97300: variable 'controller_device' from source: play vars 11701 1727096131.97343: variable 'controller_device' from source: play vars TASK [Assert that the interface is present - 'nm-bond'] ************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Monday 23 September 2024 08:55:31 -0400 (0:00:00.417) 0:00:15.938 ****** 11701 1727096131.97369: entering _queue_task() for managed_node3/assert 11701 1727096131.97638: worker is 1 (out of 1 available) 11701 1727096131.97651: exiting _queue_task() for managed_node3/assert 11701 1727096131.97664: done queuing things up, now waiting for results queue to drain 11701 1727096131.97666: waiting for pending results... 
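Both tasks from assert_device_present.yml are now in flight: the include at line 3 has already produced the stat result above, and the assert at line 5 has just been queued. Given those two task paths and the conditional evaluated a little further down ((interface_stat.stat.exists): True), the file plausibly amounts to the following pair of tasks; this is a hedged sketch reconstructed from the log, and any failure message or extra assertions in the real file are not visible here. The recurring ansible_distribution_major_version != '6' check appears to be inherited from an enclosing block, so it is left out of the sketch.

- name: Include the task 'get_interface_stat.yml'
  ansible.builtin.include_tasks: get_interface_stat.yml

- name: Assert that the interface is present - '{{ interface }}'
  ansible.builtin.assert:
    that:
      - interface_stat.stat.exists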
11701 1727096131.97845: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'nm-bond' 11701 1727096131.97931: in run() - task 0afff68d-5257-a05c-c957-00000000006f 11701 1727096131.97941: variable 'ansible_search_path' from source: unknown 11701 1727096131.97946: variable 'ansible_search_path' from source: unknown 11701 1727096131.97978: calling self._execute() 11701 1727096131.98102: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096131.98107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096131.98110: variable 'omit' from source: magic vars 11701 1727096131.98600: variable 'ansible_distribution_major_version' from source: facts 11701 1727096131.98604: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096131.98606: variable 'omit' from source: magic vars 11701 1727096131.98609: variable 'omit' from source: magic vars 11701 1727096131.98773: variable 'interface' from source: task vars 11701 1727096131.99279: variable 'controller_device' from source: play vars 11701 1727096131.99282: variable 'controller_device' from source: play vars 11701 1727096131.99284: variable 'omit' from source: magic vars 11701 1727096131.99286: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096131.99289: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096131.99291: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096131.99480: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096131.99496: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096131.99527: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096131.99533: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096131.99539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096131.99722: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096131.99748: Set connection var ansible_timeout to 10 11701 1727096131.99766: Set connection var ansible_shell_type to sh 11701 1727096131.99789: Set connection var ansible_shell_executable to /bin/sh 11701 1727096131.99797: Set connection var ansible_connection to ssh 11701 1727096131.99883: Set connection var ansible_pipelining to False 11701 1727096131.99909: variable 'ansible_shell_executable' from source: unknown 11701 1727096131.99917: variable 'ansible_connection' from source: unknown 11701 1727096131.99924: variable 'ansible_module_compression' from source: unknown 11701 1727096131.99930: variable 'ansible_shell_type' from source: unknown 11701 1727096131.99935: variable 'ansible_shell_executable' from source: unknown 11701 1727096131.99941: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096131.99948: variable 'ansible_pipelining' from source: unknown 11701 1727096131.99954: variable 'ansible_timeout' from source: unknown 11701 1727096131.99961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096132.00100: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096132.00478: variable 'omit' from source: magic vars 11701 1727096132.00481: starting attempt loop 11701 1727096132.00484: running the handler 11701 1727096132.00486: variable 'interface_stat' from source: set_fact 11701 1727096132.00489: Evaluated conditional (interface_stat.stat.exists): True 11701 1727096132.00492: handler run complete 11701 1727096132.00495: attempt loop complete, returning result 11701 1727096132.00497: _execute() done 11701 1727096132.00500: dumping result to json 11701 1727096132.00577: done dumping result, returning 11701 1727096132.00589: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'nm-bond' [0afff68d-5257-a05c-c957-00000000006f] 11701 1727096132.00598: sending task result for task 0afff68d-5257-a05c-c957-00000000006f ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11701 1727096132.00773: no more pending results, returning what we have 11701 1727096132.00777: results queue empty 11701 1727096132.00778: checking for any_errors_fatal 11701 1727096132.00788: done checking for any_errors_fatal 11701 1727096132.00788: checking for max_fail_percentage 11701 1727096132.00790: done checking for max_fail_percentage 11701 1727096132.00791: checking to see if all hosts have failed and the running result is not ok 11701 1727096132.00792: done checking to see if all hosts have failed 11701 1727096132.00793: getting the remaining hosts for this loop 11701 1727096132.00794: done getting the remaining hosts for this loop 11701 1727096132.00798: getting the next task for host managed_node3 11701 1727096132.00805: done getting next task for host managed_node3 11701 1727096132.00809: ^ task is: TASK: Include the task 'assert_profile_present.yml' 11701 1727096132.00811: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096132.00815: getting variables 11701 1727096132.00817: in VariableManager get_vars() 11701 1727096132.00860: Calling all_inventory to load vars for managed_node3 11701 1727096132.00863: Calling groups_inventory to load vars for managed_node3 11701 1727096132.00865: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096132.00879: Calling all_plugins_play to load vars for managed_node3 11701 1727096132.00883: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096132.00886: Calling groups_plugins_play to load vars for managed_node3 11701 1727096132.01610: done sending task result for task 0afff68d-5257-a05c-c957-00000000006f 11701 1727096132.01614: WORKER PROCESS EXITING 11701 1727096132.02692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096132.04977: done with get_vars() 11701 1727096132.05007: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:67 Monday 23 September 2024 08:55:32 -0400 (0:00:00.077) 0:00:16.015 ****** 11701 1727096132.05101: entering _queue_task() for managed_node3/include_tasks 11701 1727096132.05458: worker is 1 (out of 1 available) 11701 1727096132.05476: exiting _queue_task() for managed_node3/include_tasks 11701 1727096132.05488: done queuing things up, now waiting for results queue to drain 11701 1727096132.05489: waiting for pending results... 11701 1727096132.05717: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' 11701 1727096132.05820: in run() - task 0afff68d-5257-a05c-c957-000000000070 11701 1727096132.05841: variable 'ansible_search_path' from source: unknown 11701 1727096132.05900: variable 'controller_profile' from source: play vars 11701 1727096132.06092: variable 'controller_profile' from source: play vars 11701 1727096132.06112: variable 'port1_profile' from source: play vars 11701 1727096132.06186: variable 'port1_profile' from source: play vars 11701 1727096132.06202: variable 'port2_profile' from source: play vars 11701 1727096132.06271: variable 'port2_profile' from source: play vars 11701 1727096132.06291: variable 'omit' from source: magic vars 11701 1727096132.06437: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096132.06454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096132.06472: variable 'omit' from source: magic vars 11701 1727096132.06714: variable 'ansible_distribution_major_version' from source: facts 11701 1727096132.06729: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096132.06773: variable 'item' from source: unknown 11701 1727096132.06837: variable 'item' from source: unknown 11701 1727096132.07088: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096132.07091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096132.07094: variable 'omit' from source: magic vars 11701 1727096132.07284: variable 'ansible_distribution_major_version' from source: facts 11701 1727096132.07287: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096132.07289: variable 'item' from source: unknown 11701 1727096132.07333: variable 'item' from source: unknown 11701 1727096132.07525: variable 'ansible_host' from source: 
host vars for 'managed_node3' 11701 1727096132.07529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096132.07531: variable 'omit' from source: magic vars 11701 1727096132.07658: variable 'ansible_distribution_major_version' from source: facts 11701 1727096132.07671: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096132.07702: variable 'item' from source: unknown 11701 1727096132.07830: variable 'item' from source: unknown 11701 1727096132.07941: dumping result to json 11701 1727096132.07944: done dumping result, returning 11701 1727096132.07947: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' [0afff68d-5257-a05c-c957-000000000070] 11701 1727096132.07949: sending task result for task 0afff68d-5257-a05c-c957-000000000070 11701 1727096132.08108: no more pending results, returning what we have 11701 1727096132.08114: in VariableManager get_vars() 11701 1727096132.08171: Calling all_inventory to load vars for managed_node3 11701 1727096132.08175: Calling groups_inventory to load vars for managed_node3 11701 1727096132.08178: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096132.08195: Calling all_plugins_play to load vars for managed_node3 11701 1727096132.08198: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096132.08202: Calling groups_plugins_play to load vars for managed_node3 11701 1727096132.08781: done sending task result for task 0afff68d-5257-a05c-c957-000000000070 11701 1727096132.08785: WORKER PROCESS EXITING 11701 1727096132.09839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096132.10931: done with get_vars() 11701 1727096132.10948: variable 'ansible_search_path' from source: unknown 11701 1727096132.10964: variable 'ansible_search_path' from source: unknown 11701 1727096132.10971: variable 'ansible_search_path' from source: unknown 11701 1727096132.10976: we have included files to process 11701 1727096132.10977: generating all_blocks data 11701 1727096132.10979: done generating all_blocks data 11701 1727096132.10981: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11701 1727096132.10982: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11701 1727096132.10984: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11701 1727096132.11118: in VariableManager get_vars() 11701 1727096132.11134: done with get_vars() 11701 1727096132.11309: done processing included file 11701 1727096132.11310: iterating over new_blocks loaded from include file 11701 1727096132.11311: in VariableManager get_vars() 11701 1727096132.11323: done with get_vars() 11701 1727096132.11324: filtering new block on tags 11701 1727096132.11337: done filtering new block on tags 11701 1727096132.11338: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=bond0) 11701 1727096132.11342: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 
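The include at tests_bond.yml:67 loops over three play variables; the log here and just below shows them resolving to bond0 (controller_profile), bond0.0 (port1_profile) and bond0.1 (port2_profile), producing one inclusion of assert_profile_present.yml per item. A plausible shape for that task is sketched below; the relative include path and the way the current item is handed to the included file (for example via vars:) are assumptions, since neither is spelled out in this log.

- name: Include the task 'assert_profile_present.yml'
  ansible.builtin.include_tasks: tasks/assert_profile_present.yml  # path relative to the playbook, assumed
  loop:
    - "{{ controller_profile }}"  # bond0
    - "{{ port1_profile }}"       # bond0.0
    - "{{ port2_profile }}"       # bond0.1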
11701 1727096132.11343: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11701 1727096132.11345: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11701 1727096132.11407: in VariableManager get_vars() 11701 1727096132.11422: done with get_vars() 11701 1727096132.11574: done processing included file 11701 1727096132.11576: iterating over new_blocks loaded from include file 11701 1727096132.11577: in VariableManager get_vars() 11701 1727096132.11587: done with get_vars() 11701 1727096132.11588: filtering new block on tags 11701 1727096132.11600: done filtering new block on tags 11701 1727096132.11602: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=bond0.0) 11701 1727096132.11604: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11701 1727096132.11605: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11701 1727096132.11607: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11701 1727096132.11666: in VariableManager get_vars() 11701 1727096132.11719: done with get_vars() 11701 1727096132.11875: done processing included file 11701 1727096132.11876: iterating over new_blocks loaded from include file 11701 1727096132.11877: in VariableManager get_vars() 11701 1727096132.11888: done with get_vars() 11701 1727096132.11889: filtering new block on tags 11701 1727096132.11899: done filtering new block on tags 11701 1727096132.11900: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=bond0.1) 11701 1727096132.11903: extending task lists for all hosts with included blocks 11701 1727096132.13919: done extending task lists 11701 1727096132.13925: done processing included files 11701 1727096132.13926: results queue empty 11701 1727096132.13927: checking for any_errors_fatal 11701 1727096132.13929: done checking for any_errors_fatal 11701 1727096132.13930: checking for max_fail_percentage 11701 1727096132.13931: done checking for max_fail_percentage 11701 1727096132.13931: checking to see if all hosts have failed and the running result is not ok 11701 1727096132.13932: done checking to see if all hosts have failed 11701 1727096132.13932: getting the remaining hosts for this loop 11701 1727096132.13933: done getting the remaining hosts for this loop 11701 1727096132.13935: getting the next task for host managed_node3 11701 1727096132.13937: done getting next task for host managed_node3 11701 1727096132.13939: ^ task is: TASK: Include the task 'get_profile_stat.yml' 11701 1727096132.13941: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11701 1727096132.13942: getting variables 11701 1727096132.13943: in VariableManager get_vars() 11701 1727096132.13959: Calling all_inventory to load vars for managed_node3 11701 1727096132.13961: Calling groups_inventory to load vars for managed_node3 11701 1727096132.13963: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096132.13969: Calling all_plugins_play to load vars for managed_node3 11701 1727096132.13971: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096132.13973: Calling groups_plugins_play to load vars for managed_node3 11701 1727096132.18404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096132.19754: done with get_vars() 11701 1727096132.19784: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Monday 23 September 2024 08:55:32 -0400 (0:00:00.147) 0:00:16.163 ****** 11701 1727096132.19862: entering _queue_task() for managed_node3/include_tasks 11701 1727096132.20216: worker is 1 (out of 1 available) 11701 1727096132.20230: exiting _queue_task() for managed_node3/include_tasks 11701 1727096132.20242: done queuing things up, now waiting for results queue to drain 11701 1727096132.20243: waiting for pending results... 
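Each per-item pass through assert_profile_present.yml begins, as the task path shows, with an include of get_profile_stat.yml at line 3. Only that include is visible in this excerpt; a minimal sketch of it, with the rest of the file hedged as a comment:

- name: Include the task 'get_profile_stat.yml'
  ansible.builtin.include_tasks: get_profile_stat.yml
# presumably followed by an assertion on the gathered profile data,
# analogous to assert_device_present.yml above, but not reached in this excerpt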
11701 1727096132.20531: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 11701 1727096132.20605: in run() - task 0afff68d-5257-a05c-c957-00000000025f 11701 1727096132.20622: variable 'ansible_search_path' from source: unknown 11701 1727096132.20625: variable 'ansible_search_path' from source: unknown 11701 1727096132.20663: calling self._execute() 11701 1727096132.20740: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096132.20747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096132.20757: variable 'omit' from source: magic vars 11701 1727096132.21039: variable 'ansible_distribution_major_version' from source: facts 11701 1727096132.21052: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096132.21057: _execute() done 11701 1727096132.21059: dumping result to json 11701 1727096132.21062: done dumping result, returning 11701 1727096132.21071: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0afff68d-5257-a05c-c957-00000000025f] 11701 1727096132.21073: sending task result for task 0afff68d-5257-a05c-c957-00000000025f 11701 1727096132.21155: done sending task result for task 0afff68d-5257-a05c-c957-00000000025f 11701 1727096132.21158: WORKER PROCESS EXITING 11701 1727096132.21186: no more pending results, returning what we have 11701 1727096132.21192: in VariableManager get_vars() 11701 1727096132.21241: Calling all_inventory to load vars for managed_node3 11701 1727096132.21243: Calling groups_inventory to load vars for managed_node3 11701 1727096132.21246: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096132.21262: Calling all_plugins_play to load vars for managed_node3 11701 1727096132.21264: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096132.21269: Calling groups_plugins_play to load vars for managed_node3 11701 1727096132.22065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096132.23496: done with get_vars() 11701 1727096132.23510: variable 'ansible_search_path' from source: unknown 11701 1727096132.23511: variable 'ansible_search_path' from source: unknown 11701 1727096132.23537: we have included files to process 11701 1727096132.23538: generating all_blocks data 11701 1727096132.23539: done generating all_blocks data 11701 1727096132.23541: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11701 1727096132.23541: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11701 1727096132.23543: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11701 1727096132.24231: done processing included file 11701 1727096132.24233: iterating over new_blocks loaded from include file 11701 1727096132.24234: in VariableManager get_vars() 11701 1727096132.24248: done with get_vars() 11701 1727096132.24249: filtering new block on tags 11701 1727096132.24266: done filtering new block on tags 11701 1727096132.24269: in VariableManager get_vars() 11701 1727096132.24282: done with get_vars() 11701 1727096132.24283: filtering new block on tags 11701 1727096132.24295: done filtering new block on tags 11701 1727096132.24296: done iterating over 
new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 11701 1727096132.24300: extending task lists for all hosts with included blocks 11701 1727096132.24447: done extending task lists 11701 1727096132.24448: done processing included files 11701 1727096132.24449: results queue empty 11701 1727096132.24449: checking for any_errors_fatal 11701 1727096132.24454: done checking for any_errors_fatal 11701 1727096132.24455: checking for max_fail_percentage 11701 1727096132.24455: done checking for max_fail_percentage 11701 1727096132.24456: checking to see if all hosts have failed and the running result is not ok 11701 1727096132.24456: done checking to see if all hosts have failed 11701 1727096132.24457: getting the remaining hosts for this loop 11701 1727096132.24458: done getting the remaining hosts for this loop 11701 1727096132.24459: getting the next task for host managed_node3 11701 1727096132.24462: done getting next task for host managed_node3 11701 1727096132.24464: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 11701 1727096132.24466: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096132.24468: getting variables 11701 1727096132.24469: in VariableManager get_vars() 11701 1727096132.24479: Calling all_inventory to load vars for managed_node3 11701 1727096132.24480: Calling groups_inventory to load vars for managed_node3 11701 1727096132.24481: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096132.24486: Calling all_plugins_play to load vars for managed_node3 11701 1727096132.24487: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096132.24489: Calling groups_plugins_play to load vars for managed_node3 11701 1727096132.25110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096132.25971: done with get_vars() 11701 1727096132.25990: done getting variables 11701 1727096132.26021: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Monday 23 September 2024 08:55:32 -0400 (0:00:00.061) 0:00:16.225 ****** 11701 1727096132.26042: entering _queue_task() for managed_node3/set_fact 11701 1727096132.26309: worker is 1 (out of 1 available) 11701 1727096132.26321: exiting _queue_task() for managed_node3/set_fact 11701 1727096132.26334: done queuing things up, now waiting for results queue to drain 11701 1727096132.26335: waiting for pending results... 
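The set_fact task queued here is defined at get_profile_stat.yml:3. The playbook file itself is not reproduced in this log, but judging from the ansible_facts reported in the task result a few lines below, it presumably amounts to a plain set_fact that resets the three lsr_net_profile_* flags before each profile is checked. A minimal sketch, assuming that layout (field order and formatting are guesses, not the actual file contents):

- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    # all three flags start out false; later tasks flip them if evidence is found
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false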
11701 1727096132.26514: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 11701 1727096132.26584: in run() - task 0afff68d-5257-a05c-c957-0000000003b0 11701 1727096132.26597: variable 'ansible_search_path' from source: unknown 11701 1727096132.26600: variable 'ansible_search_path' from source: unknown 11701 1727096132.26629: calling self._execute() 11701 1727096132.26707: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096132.26711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096132.26720: variable 'omit' from source: magic vars 11701 1727096132.26995: variable 'ansible_distribution_major_version' from source: facts 11701 1727096132.27008: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096132.27012: variable 'omit' from source: magic vars 11701 1727096132.27047: variable 'omit' from source: magic vars 11701 1727096132.27073: variable 'omit' from source: magic vars 11701 1727096132.27107: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096132.27136: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096132.27155: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096132.27169: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096132.27180: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096132.27203: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096132.27206: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096132.27209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096132.27281: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096132.27285: Set connection var ansible_timeout to 10 11701 1727096132.27288: Set connection var ansible_shell_type to sh 11701 1727096132.27293: Set connection var ansible_shell_executable to /bin/sh 11701 1727096132.27295: Set connection var ansible_connection to ssh 11701 1727096132.27302: Set connection var ansible_pipelining to False 11701 1727096132.27317: variable 'ansible_shell_executable' from source: unknown 11701 1727096132.27320: variable 'ansible_connection' from source: unknown 11701 1727096132.27323: variable 'ansible_module_compression' from source: unknown 11701 1727096132.27327: variable 'ansible_shell_type' from source: unknown 11701 1727096132.27330: variable 'ansible_shell_executable' from source: unknown 11701 1727096132.27332: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096132.27335: variable 'ansible_pipelining' from source: unknown 11701 1727096132.27337: variable 'ansible_timeout' from source: unknown 11701 1727096132.27339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096132.27438: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096132.27445: variable 
'omit' from source: magic vars 11701 1727096132.27452: starting attempt loop 11701 1727096132.27461: running the handler 11701 1727096132.27473: handler run complete 11701 1727096132.27481: attempt loop complete, returning result 11701 1727096132.27483: _execute() done 11701 1727096132.27486: dumping result to json 11701 1727096132.27488: done dumping result, returning 11701 1727096132.27495: done running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0afff68d-5257-a05c-c957-0000000003b0] 11701 1727096132.27497: sending task result for task 0afff68d-5257-a05c-c957-0000000003b0 11701 1727096132.27578: done sending task result for task 0afff68d-5257-a05c-c957-0000000003b0 11701 1727096132.27581: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 11701 1727096132.27633: no more pending results, returning what we have 11701 1727096132.27636: results queue empty 11701 1727096132.27637: checking for any_errors_fatal 11701 1727096132.27639: done checking for any_errors_fatal 11701 1727096132.27639: checking for max_fail_percentage 11701 1727096132.27641: done checking for max_fail_percentage 11701 1727096132.27642: checking to see if all hosts have failed and the running result is not ok 11701 1727096132.27643: done checking to see if all hosts have failed 11701 1727096132.27644: getting the remaining hosts for this loop 11701 1727096132.27645: done getting the remaining hosts for this loop 11701 1727096132.27648: getting the next task for host managed_node3 11701 1727096132.27658: done getting next task for host managed_node3 11701 1727096132.27660: ^ task is: TASK: Stat profile file 11701 1727096132.27666: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096132.27678: getting variables 11701 1727096132.27680: in VariableManager get_vars() 11701 1727096132.27721: Calling all_inventory to load vars for managed_node3 11701 1727096132.27724: Calling groups_inventory to load vars for managed_node3 11701 1727096132.27726: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096132.27736: Calling all_plugins_play to load vars for managed_node3 11701 1727096132.27739: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096132.27741: Calling groups_plugins_play to load vars for managed_node3 11701 1727096132.28604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096132.29472: done with get_vars() 11701 1727096132.29488: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Monday 23 September 2024 08:55:32 -0400 (0:00:00.035) 0:00:16.260 ****** 11701 1727096132.29559: entering _queue_task() for managed_node3/stat 11701 1727096132.29812: worker is 1 (out of 1 available) 11701 1727096132.29824: exiting _queue_task() for managed_node3/stat 11701 1727096132.29836: done queuing things up, now waiting for results queue to drain 11701 1727096132.29837: waiting for pending results... 11701 1727096132.30012: running TaskExecutor() for managed_node3/TASK: Stat profile file 11701 1727096132.30080: in run() - task 0afff68d-5257-a05c-c957-0000000003b1 11701 1727096132.30093: variable 'ansible_search_path' from source: unknown 11701 1727096132.30096: variable 'ansible_search_path' from source: unknown 11701 1727096132.30124: calling self._execute() 11701 1727096132.30201: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096132.30205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096132.30215: variable 'omit' from source: magic vars 11701 1727096132.30482: variable 'ansible_distribution_major_version' from source: facts 11701 1727096132.30492: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096132.30499: variable 'omit' from source: magic vars 11701 1727096132.30533: variable 'omit' from source: magic vars 11701 1727096132.30602: variable 'profile' from source: include params 11701 1727096132.30606: variable 'item' from source: include params 11701 1727096132.30657: variable 'item' from source: include params 11701 1727096132.30671: variable 'omit' from source: magic vars 11701 1727096132.30705: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096132.30733: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096132.30749: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096132.30763: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096132.30775: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096132.30798: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096132.30801: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096132.30803: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096132.30877: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096132.30881: Set connection var ansible_timeout to 10 11701 1727096132.30883: Set connection var ansible_shell_type to sh 11701 1727096132.30889: Set connection var ansible_shell_executable to /bin/sh 11701 1727096132.30892: Set connection var ansible_connection to ssh 11701 1727096132.30900: Set connection var ansible_pipelining to False 11701 1727096132.30916: variable 'ansible_shell_executable' from source: unknown 11701 1727096132.30919: variable 'ansible_connection' from source: unknown 11701 1727096132.30921: variable 'ansible_module_compression' from source: unknown 11701 1727096132.30924: variable 'ansible_shell_type' from source: unknown 11701 1727096132.30926: variable 'ansible_shell_executable' from source: unknown 11701 1727096132.30928: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096132.30932: variable 'ansible_pipelining' from source: unknown 11701 1727096132.30935: variable 'ansible_timeout' from source: unknown 11701 1727096132.30939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096132.31083: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11701 1727096132.31091: variable 'omit' from source: magic vars 11701 1727096132.31096: starting attempt loop 11701 1727096132.31099: running the handler 11701 1727096132.31110: _low_level_execute_command(): starting 11701 1727096132.31117: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096132.31645: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096132.31649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096132.31652: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096132.31654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 11701 1727096132.31657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096132.31712: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096132.31715: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096132.31717: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096132.31766: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096132.33441: stdout chunk (state=3): >>>/root <<< 11701 1727096132.33530: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096132.33566: stderr chunk (state=3): >>><<< 11701 1727096132.33572: stdout chunk (state=3): >>><<< 11701 1727096132.33593: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096132.33605: _low_level_execute_command(): starting 11701 1727096132.33610: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096132.3359344-12443-56120930772839 `" && echo ansible-tmp-1727096132.3359344-12443-56120930772839="` echo /root/.ansible/tmp/ansible-tmp-1727096132.3359344-12443-56120930772839 `" ) && sleep 0' 11701 1727096132.34063: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096132.34066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096132.34077: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11701 1727096132.34080: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096132.34128: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096132.34132: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096132.34136: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096132.34172: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096132.36103: stdout chunk (state=3): 
>>>ansible-tmp-1727096132.3359344-12443-56120930772839=/root/.ansible/tmp/ansible-tmp-1727096132.3359344-12443-56120930772839 <<< 11701 1727096132.36265: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096132.36272: stdout chunk (state=3): >>><<< 11701 1727096132.36275: stderr chunk (state=3): >>><<< 11701 1727096132.36296: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096132.3359344-12443-56120930772839=/root/.ansible/tmp/ansible-tmp-1727096132.3359344-12443-56120930772839 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096132.36377: variable 'ansible_module_compression' from source: unknown 11701 1727096132.36432: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11701ub3f79xu/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11701 1727096132.36492: variable 'ansible_facts' from source: unknown 11701 1727096132.36602: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096132.3359344-12443-56120930772839/AnsiballZ_stat.py 11701 1727096132.36796: Sending initial data 11701 1727096132.36799: Sent initial data (152 bytes) 11701 1727096132.37459: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096132.37585: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096132.37615: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096132.37693: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096132.39330: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096132.39356: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11701 1727096132.39388: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmp85ny0wxn /root/.ansible/tmp/ansible-tmp-1727096132.3359344-12443-56120930772839/AnsiballZ_stat.py <<< 11701 1727096132.39398: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096132.3359344-12443-56120930772839/AnsiballZ_stat.py" <<< 11701 1727096132.39419: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmp85ny0wxn" to remote "/root/.ansible/tmp/ansible-tmp-1727096132.3359344-12443-56120930772839/AnsiballZ_stat.py" <<< 11701 1727096132.39423: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096132.3359344-12443-56120930772839/AnsiballZ_stat.py" <<< 11701 1727096132.39906: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096132.39950: stderr chunk (state=3): >>><<< 11701 1727096132.39953: stdout chunk (state=3): >>><<< 11701 1727096132.39991: done transferring module to remote 11701 1727096132.40002: _low_level_execute_command(): starting 11701 1727096132.40006: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096132.3359344-12443-56120930772839/ /root/.ansible/tmp/ansible-tmp-1727096132.3359344-12443-56120930772839/AnsiballZ_stat.py && sleep 0' 11701 1727096132.40438: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096132.40475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096132.40478: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 11701 1727096132.40480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096132.40482: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096132.40484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 
10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096132.40533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096132.40541: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096132.40575: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096132.42419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096132.42444: stderr chunk (state=3): >>><<< 11701 1727096132.42447: stdout chunk (state=3): >>><<< 11701 1727096132.42463: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096132.42466: _low_level_execute_command(): starting 11701 1727096132.42473: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096132.3359344-12443-56120930772839/AnsiballZ_stat.py && sleep 0' 11701 1727096132.42924: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096132.42928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 11701 1727096132.42930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11701 1727096132.42933: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096132.42935: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096132.42985: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 
1727096132.42993: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096132.42998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096132.43032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096132.58523: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11701 1727096132.60192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 11701 1727096132.60196: stdout chunk (state=3): >>><<< 11701 1727096132.60198: stderr chunk (state=3): >>><<< 11701 1727096132.60201: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
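The module_args echoed in the JSON result above show exactly how the "Stat profile file" task at get_profile_stat.yml:9 invoked the stat module on the managed host. A rough reconstruction of that task is sketched below; the register name profile_stat is inferred from the conditional evaluated in the following task, and templating the path from the include parameter `profile` (resolved to bond0 in this run) is an assumption:

- name: Stat profile file
  stat:
    # resolved to /etc/sysconfig/network-scripts/ifcfg-bond0 in this run
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat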
11701 1727096132.60203: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096132.3359344-12443-56120930772839/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096132.60205: _low_level_execute_command(): starting 11701 1727096132.60207: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096132.3359344-12443-56120930772839/ > /dev/null 2>&1 && sleep 0' 11701 1727096132.61020: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096132.61024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096132.61028: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096132.61030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096132.61096: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096132.61117: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096132.61152: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096132.63092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096132.63111: stdout chunk (state=3): >>><<< 11701 1727096132.63124: stderr chunk (state=3): >>><<< 11701 1727096132.63145: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096132.63156: handler run complete 11701 1727096132.63185: attempt loop complete, returning result 11701 1727096132.63198: _execute() done 11701 1727096132.63210: dumping result to json 11701 1727096132.63299: done dumping result, returning 11701 1727096132.63302: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0afff68d-5257-a05c-c957-0000000003b1] 11701 1727096132.63304: sending task result for task 0afff68d-5257-a05c-c957-0000000003b1 11701 1727096132.63375: done sending task result for task 0afff68d-5257-a05c-c957-0000000003b1 11701 1727096132.63378: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 11701 1727096132.63437: no more pending results, returning what we have 11701 1727096132.63441: results queue empty 11701 1727096132.63442: checking for any_errors_fatal 11701 1727096132.63446: done checking for any_errors_fatal 11701 1727096132.63447: checking for max_fail_percentage 11701 1727096132.63449: done checking for max_fail_percentage 11701 1727096132.63450: checking to see if all hosts have failed and the running result is not ok 11701 1727096132.63451: done checking to see if all hosts have failed 11701 1727096132.63452: getting the remaining hosts for this loop 11701 1727096132.63453: done getting the remaining hosts for this loop 11701 1727096132.63457: getting the next task for host managed_node3 11701 1727096132.63465: done getting next task for host managed_node3 11701 1727096132.63470: ^ task is: TASK: Set NM profile exist flag based on the profile files 11701 1727096132.63474: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096132.63478: getting variables 11701 1727096132.63480: in VariableManager get_vars() 11701 1727096132.63523: Calling all_inventory to load vars for managed_node3 11701 1727096132.63526: Calling groups_inventory to load vars for managed_node3 11701 1727096132.63528: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096132.63540: Calling all_plugins_play to load vars for managed_node3 11701 1727096132.63543: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096132.63546: Calling groups_plugins_play to load vars for managed_node3 11701 1727096132.65344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096132.66912: done with get_vars() 11701 1727096132.66942: done getting variables 11701 1727096132.67011: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Monday 23 September 2024 08:55:32 -0400 (0:00:00.374) 0:00:16.635 ****** 11701 1727096132.67044: entering _queue_task() for managed_node3/set_fact 11701 1727096132.67400: worker is 1 (out of 1 available) 11701 1727096132.67412: exiting _queue_task() for managed_node3/set_fact 11701 1727096132.67424: done queuing things up, now waiting for results queue to drain 11701 1727096132.67425: waiting for pending results... 
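The task queued here, "Set NM profile exist flag based on the profile files" (get_profile_stat.yml:17), is skipped just below because profile_stat.stat.exists evaluated to false. Its effect is presumably to flip the lsr_net_profile_exists flag when the ifcfg file is present; a hedged sketch, assuming the flag name and value carried over from the initialization task earlier (the actual fact it sets is not shown in this excerpt):

- name: Set NM profile exist flag based on the profile files
  set_fact:
    # assumed: mark the initscripts profile file as present
    lsr_net_profile_exists: true
  when: profile_stat.stat.exists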
11701 1727096132.67795: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 11701 1727096132.67833: in run() - task 0afff68d-5257-a05c-c957-0000000003b2 11701 1727096132.67892: variable 'ansible_search_path' from source: unknown 11701 1727096132.67895: variable 'ansible_search_path' from source: unknown 11701 1727096132.67912: calling self._execute() 11701 1727096132.68020: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096132.68031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096132.68045: variable 'omit' from source: magic vars 11701 1727096132.68672: variable 'ansible_distribution_major_version' from source: facts 11701 1727096132.68676: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096132.68678: variable 'profile_stat' from source: set_fact 11701 1727096132.68681: Evaluated conditional (profile_stat.stat.exists): False 11701 1727096132.68683: when evaluation is False, skipping this task 11701 1727096132.68685: _execute() done 11701 1727096132.68688: dumping result to json 11701 1727096132.68690: done dumping result, returning 11701 1727096132.68693: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0afff68d-5257-a05c-c957-0000000003b2] 11701 1727096132.68695: sending task result for task 0afff68d-5257-a05c-c957-0000000003b2 11701 1727096132.68761: done sending task result for task 0afff68d-5257-a05c-c957-0000000003b2 11701 1727096132.68764: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11701 1727096132.68813: no more pending results, returning what we have 11701 1727096132.68817: results queue empty 11701 1727096132.68819: checking for any_errors_fatal 11701 1727096132.68826: done checking for any_errors_fatal 11701 1727096132.68827: checking for max_fail_percentage 11701 1727096132.68829: done checking for max_fail_percentage 11701 1727096132.68830: checking to see if all hosts have failed and the running result is not ok 11701 1727096132.68831: done checking to see if all hosts have failed 11701 1727096132.68832: getting the remaining hosts for this loop 11701 1727096132.68833: done getting the remaining hosts for this loop 11701 1727096132.68836: getting the next task for host managed_node3 11701 1727096132.68843: done getting next task for host managed_node3 11701 1727096132.68846: ^ task is: TASK: Get NM profile info 11701 1727096132.68853: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096132.68858: getting variables 11701 1727096132.68859: in VariableManager get_vars() 11701 1727096132.68905: Calling all_inventory to load vars for managed_node3 11701 1727096132.68908: Calling groups_inventory to load vars for managed_node3 11701 1727096132.68911: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096132.68925: Calling all_plugins_play to load vars for managed_node3 11701 1727096132.68928: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096132.68931: Calling groups_plugins_play to load vars for managed_node3 11701 1727096132.70605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096132.72155: done with get_vars() 11701 1727096132.72188: done getting variables 11701 1727096132.72255: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Monday 23 September 2024 08:55:32 -0400 (0:00:00.052) 0:00:16.687 ****** 11701 1727096132.72289: entering _queue_task() for managed_node3/shell 11701 1727096132.72640: worker is 1 (out of 1 available) 11701 1727096132.72657: exiting _queue_task() for managed_node3/shell 11701 1727096132.72872: done queuing things up, now waiting for results queue to drain 11701 1727096132.72874: waiting for pending results... 11701 1727096132.72955: running TaskExecutor() for managed_node3/TASK: Get NM profile info 11701 1727096132.73082: in run() - task 0afff68d-5257-a05c-c957-0000000003b3 11701 1727096132.73107: variable 'ansible_search_path' from source: unknown 11701 1727096132.73114: variable 'ansible_search_path' from source: unknown 11701 1727096132.73159: calling self._execute() 11701 1727096132.73269: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096132.73315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096132.73318: variable 'omit' from source: magic vars 11701 1727096132.73689: variable 'ansible_distribution_major_version' from source: facts 11701 1727096132.73706: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096132.73716: variable 'omit' from source: magic vars 11701 1727096132.73777: variable 'omit' from source: magic vars 11701 1727096132.73969: variable 'profile' from source: include params 11701 1727096132.73973: variable 'item' from source: include params 11701 1727096132.73975: variable 'item' from source: include params 11701 1727096132.73997: variable 'omit' from source: magic vars 11701 1727096132.74043: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096132.74095: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096132.74119: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096132.74141: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096132.74160: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096132.74200: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096132.74209: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096132.74216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096132.74328: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096132.74339: Set connection var ansible_timeout to 10 11701 1727096132.74346: Set connection var ansible_shell_type to sh 11701 1727096132.74358: Set connection var ansible_shell_executable to /bin/sh 11701 1727096132.74399: Set connection var ansible_connection to ssh 11701 1727096132.74402: Set connection var ansible_pipelining to False 11701 1727096132.74408: variable 'ansible_shell_executable' from source: unknown 11701 1727096132.74415: variable 'ansible_connection' from source: unknown 11701 1727096132.74421: variable 'ansible_module_compression' from source: unknown 11701 1727096132.74426: variable 'ansible_shell_type' from source: unknown 11701 1727096132.74432: variable 'ansible_shell_executable' from source: unknown 11701 1727096132.74438: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096132.74444: variable 'ansible_pipelining' from source: unknown 11701 1727096132.74507: variable 'ansible_timeout' from source: unknown 11701 1727096132.74510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096132.74602: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096132.74621: variable 'omit' from source: magic vars 11701 1727096132.74631: starting attempt loop 11701 1727096132.74637: running the handler 11701 1727096132.74653: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096132.74679: _low_level_execute_command(): starting 11701 1727096132.74692: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096132.75430: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096132.75447: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096132.75464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096132.75495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 11701 1727096132.75598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096132.75652: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096132.75697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096132.77392: stdout chunk (state=3): >>>/root <<< 11701 1727096132.77542: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096132.77546: stdout chunk (state=3): >>><<< 11701 1727096132.77549: stderr chunk (state=3): >>><<< 11701 1727096132.77575: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096132.77680: _low_level_execute_command(): starting 11701 1727096132.77684: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096132.7758222-12469-115100058274776 `" && echo ansible-tmp-1727096132.7758222-12469-115100058274776="` echo /root/.ansible/tmp/ansible-tmp-1727096132.7758222-12469-115100058274776 `" ) && sleep 0' 11701 1727096132.78227: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096132.78254: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096132.78272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096132.78292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096132.78391: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096132.78410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096132.78424: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096132.78446: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096132.78524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096132.80527: stdout chunk (state=3): >>>ansible-tmp-1727096132.7758222-12469-115100058274776=/root/.ansible/tmp/ansible-tmp-1727096132.7758222-12469-115100058274776 <<< 11701 1727096132.80660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096132.80682: stdout chunk (state=3): >>><<< 11701 1727096132.80695: stderr chunk (state=3): >>><<< 11701 1727096132.80718: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096132.7758222-12469-115100058274776=/root/.ansible/tmp/ansible-tmp-1727096132.7758222-12469-115100058274776 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096132.80874: variable 'ansible_module_compression' from source: unknown 11701 1727096132.80876: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11701ub3f79xu/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11701 1727096132.80878: variable 'ansible_facts' from source: unknown 11701 1727096132.81176: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096132.7758222-12469-115100058274776/AnsiballZ_command.py 11701 1727096132.81303: Sending initial data 11701 1727096132.81313: Sent initial data (156 bytes) 11701 1727096132.82059: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096132.82116: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096132.82135: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096132.82163: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096132.82301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096132.83927: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096132.83994: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11701 1727096132.84044: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpsnh1dtgd /root/.ansible/tmp/ansible-tmp-1727096132.7758222-12469-115100058274776/AnsiballZ_command.py <<< 11701 1727096132.84047: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096132.7758222-12469-115100058274776/AnsiballZ_command.py" <<< 11701 1727096132.84088: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpsnh1dtgd" to remote "/root/.ansible/tmp/ansible-tmp-1727096132.7758222-12469-115100058274776/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096132.7758222-12469-115100058274776/AnsiballZ_command.py" <<< 11701 1727096132.84836: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096132.84910: stderr chunk (state=3): >>><<< 11701 1727096132.85040: stdout chunk (state=3): >>><<< 11701 1727096132.85043: done transferring module to remote 11701 1727096132.85046: _low_level_execute_command(): starting 11701 1727096132.85049: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096132.7758222-12469-115100058274776/ /root/.ansible/tmp/ansible-tmp-1727096132.7758222-12469-115100058274776/AnsiballZ_command.py && sleep 0' 11701 1727096132.86653: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096132.86669: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096132.86686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096132.86702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096132.86903: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096132.86939: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096132.86984: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096132.87014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096132.88935: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096132.88946: stdout chunk (state=3): >>><<< 11701 1727096132.88963: stderr chunk (state=3): >>><<< 11701 1727096132.88990: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096132.88998: _low_level_execute_command(): starting 11701 1727096132.89008: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096132.7758222-12469-115100058274776/AnsiballZ_command.py && sleep 0' 11701 1727096132.90188: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096132.90337: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096132.90480: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096132.90499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096132.90526: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096132.90714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096133.08232: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-23 08:55:33.058924", "end": "2024-09-23 08:55:33.079384", "delta": "0:00:00.020460", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11701 1727096133.09979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 11701 1727096133.09991: stdout chunk (state=3): >>><<< 11701 1727096133.10004: stderr chunk (state=3): >>><<< 11701 1727096133.10030: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-23 08:55:33.058924", "end": "2024-09-23 08:55:33.079384", "delta": "0:00:00.020460", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 11701 1727096133.10084: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096132.7758222-12469-115100058274776/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096133.10166: _low_level_execute_command(): starting 11701 1727096133.10172: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096132.7758222-12469-115100058274776/ > /dev/null 2>&1 && sleep 0' 11701 1727096133.10754: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096133.10773: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096133.10792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096133.10841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096133.10855: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096133.10871: stderr chunk (state=3): >>>debug2: match found <<< 11701 1727096133.10948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096133.10962: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096133.10983: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096133.10994: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096133.11062: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096133.12958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096133.12991: stdout chunk (state=3): >>><<< 11701 1727096133.13015: stderr chunk (state=3): >>><<< 11701 1727096133.13049: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096133.13177: handler run complete 11701 1727096133.13180: Evaluated conditional (False): False 11701 1727096133.13183: attempt loop complete, returning result 11701 1727096133.13185: _execute() done 11701 1727096133.13187: dumping result to json 11701 1727096133.13189: done dumping result, returning 11701 1727096133.13191: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0afff68d-5257-a05c-c957-0000000003b3] 11701 1727096133.13193: sending task result for task 0afff68d-5257-a05c-c957-0000000003b3 11701 1727096133.13263: done sending task result for task 0afff68d-5257-a05c-c957-0000000003b3 11701 1727096133.13267: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "delta": "0:00:00.020460", "end": "2024-09-23 08:55:33.079384", "rc": 0, "start": "2024-09-23 08:55:33.058924" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection bond0 /etc/NetworkManager/system-connections/bond0.nmconnection 11701 1727096133.13344: no more pending results, returning what we have 11701 1727096133.13348: results queue empty 11701 1727096133.13349: checking for any_errors_fatal 11701 1727096133.13354: done checking for any_errors_fatal 11701 1727096133.13355: checking for max_fail_percentage 11701 1727096133.13357: done checking for max_fail_percentage 11701 1727096133.13359: checking to see if all hosts have failed and the running result is not ok 11701 1727096133.13360: done checking to see if all hosts have failed 11701 1727096133.13361: getting the remaining hosts for this loop 11701 1727096133.13362: done getting the remaining hosts for this loop 11701 1727096133.13365: getting the next task for host managed_node3 11701 1727096133.13402: done getting next task for host managed_node3 11701 1727096133.13406: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11701 1727096133.13409: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11701 1727096133.13414: getting variables 11701 1727096133.13416: in VariableManager get_vars() 11701 1727096133.13529: Calling all_inventory to load vars for managed_node3 11701 1727096133.13532: Calling groups_inventory to load vars for managed_node3 11701 1727096133.13535: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096133.13547: Calling all_plugins_play to load vars for managed_node3 11701 1727096133.13550: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096133.13554: Calling groups_plugins_play to load vars for managed_node3 11701 1727096133.15530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096133.17313: done with get_vars() 11701 1727096133.17339: done getting variables 11701 1727096133.17400: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Monday 23 September 2024 08:55:33 -0400 (0:00:00.451) 0:00:17.138 ****** 11701 1727096133.17437: entering _queue_task() for managed_node3/set_fact 11701 1727096133.17885: worker is 1 (out of 1 available) 11701 1727096133.17899: exiting _queue_task() for managed_node3/set_fact 11701 1727096133.17910: done queuing things up, now waiting for results queue to drain 11701 1727096133.17912: waiting for pending results... 
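[Editor's note] The chunk above records the full low-level lifecycle Ansible used for the "Get NM profile info" task: create a remote temp dir, sftp the AnsiballZ-wrapped command module, chmod it, run it with /usr/bin/python3.12, parse the JSON result, and remove the temp dir. The payload was the shell pipeline nmcli -f NAME,FILENAME connection show | grep bond0 | grep /etc, which returned three bond0 keyfile profiles. A minimal Python sketch of just that payload check follows; the function name is hypothetical and this is not Ansible's code, it only mirrors the logged command (and assumes nmcli is installed locally).

import subprocess

def get_bond0_profiles():
    # Equivalent of the logged pipeline:
    #   nmcli -f NAME,FILENAME connection show | grep bond0 | grep /etc
    result = subprocess.run(
        ["nmcli", "-f", "NAME,FILENAME", "connection", "show"],
        capture_output=True, text=True, check=True,
    )
    return [line for line in result.stdout.splitlines()
            if "bond0" in line and "/etc" in line]

if __name__ == "__main__":
    matches = get_bond0_profiles()
    # The task treats rc == 0 (matching lines found) as "profile exists";
    # an empty list here corresponds to the grep pipeline exiting non-zero.
    print("\n".join(matches))

In the logged run this produced bond0, bond0.0 and bond0.1 entries under /etc/NetworkManager/system-connections, so rc was 0 and nm_profile_exists is registered as successful.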
11701 1727096133.18332: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11701 1727096133.18337: in run() - task 0afff68d-5257-a05c-c957-0000000003b4 11701 1727096133.18341: variable 'ansible_search_path' from source: unknown 11701 1727096133.18344: variable 'ansible_search_path' from source: unknown 11701 1727096133.18378: calling self._execute() 11701 1727096133.18509: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096133.18522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096133.18536: variable 'omit' from source: magic vars 11701 1727096133.18946: variable 'ansible_distribution_major_version' from source: facts 11701 1727096133.19027: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096133.19112: variable 'nm_profile_exists' from source: set_fact 11701 1727096133.19133: Evaluated conditional (nm_profile_exists.rc == 0): True 11701 1727096133.19258: variable 'omit' from source: magic vars 11701 1727096133.19263: variable 'omit' from source: magic vars 11701 1727096133.19265: variable 'omit' from source: magic vars 11701 1727096133.19288: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096133.19336: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096133.19377: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096133.19409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096133.19428: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096133.19473: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096133.19483: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096133.19497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096133.19620: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096133.19631: Set connection var ansible_timeout to 10 11701 1727096133.19651: Set connection var ansible_shell_type to sh 11701 1727096133.19690: Set connection var ansible_shell_executable to /bin/sh 11701 1727096133.19693: Set connection var ansible_connection to ssh 11701 1727096133.19696: Set connection var ansible_pipelining to False 11701 1727096133.19718: variable 'ansible_shell_executable' from source: unknown 11701 1727096133.19733: variable 'ansible_connection' from source: unknown 11701 1727096133.19746: variable 'ansible_module_compression' from source: unknown 11701 1727096133.19799: variable 'ansible_shell_type' from source: unknown 11701 1727096133.19802: variable 'ansible_shell_executable' from source: unknown 11701 1727096133.19804: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096133.19806: variable 'ansible_pipelining' from source: unknown 11701 1727096133.19808: variable 'ansible_timeout' from source: unknown 11701 1727096133.19813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096133.19944: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096133.19960: variable 'omit' from source: magic vars 11701 1727096133.19973: starting attempt loop 11701 1727096133.19994: running the handler 11701 1727096133.20015: handler run complete 11701 1727096133.20036: attempt loop complete, returning result 11701 1727096133.20039: _execute() done 11701 1727096133.20041: dumping result to json 11701 1727096133.20124: done dumping result, returning 11701 1727096133.20128: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0afff68d-5257-a05c-c957-0000000003b4] 11701 1727096133.20130: sending task result for task 0afff68d-5257-a05c-c957-0000000003b4 11701 1727096133.20202: done sending task result for task 0afff68d-5257-a05c-c957-0000000003b4 11701 1727096133.20206: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11701 1727096133.20269: no more pending results, returning what we have 11701 1727096133.20273: results queue empty 11701 1727096133.20273: checking for any_errors_fatal 11701 1727096133.20282: done checking for any_errors_fatal 11701 1727096133.20283: checking for max_fail_percentage 11701 1727096133.20285: done checking for max_fail_percentage 11701 1727096133.20286: checking to see if all hosts have failed and the running result is not ok 11701 1727096133.20287: done checking to see if all hosts have failed 11701 1727096133.20288: getting the remaining hosts for this loop 11701 1727096133.20290: done getting the remaining hosts for this loop 11701 1727096133.20293: getting the next task for host managed_node3 11701 1727096133.20304: done getting next task for host managed_node3 11701 1727096133.20307: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11701 1727096133.20312: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096133.20316: getting variables 11701 1727096133.20318: in VariableManager get_vars() 11701 1727096133.20360: Calling all_inventory to load vars for managed_node3 11701 1727096133.20362: Calling groups_inventory to load vars for managed_node3 11701 1727096133.20365: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096133.20491: Calling all_plugins_play to load vars for managed_node3 11701 1727096133.20495: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096133.20499: Calling groups_plugins_play to load vars for managed_node3 11701 1727096133.22016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096133.23721: done with get_vars() 11701 1727096133.23758: done getting variables 11701 1727096133.23824: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11701 1727096133.23955: variable 'profile' from source: include params 11701 1727096133.23959: variable 'item' from source: include params 11701 1727096133.24018: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Monday 23 September 2024 08:55:33 -0400 (0:00:00.066) 0:00:17.205 ****** 11701 1727096133.24062: entering _queue_task() for managed_node3/command 11701 1727096133.24600: worker is 1 (out of 1 available) 11701 1727096133.24611: exiting _queue_task() for managed_node3/command 11701 1727096133.24619: done queuing things up, now waiting for results queue to drain 11701 1727096133.24621: waiting for pending results... 
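[Editor's note] The set_fact task above gates on ansible_distribution_major_version != '6' and nm_profile_exists.rc == 0, and its logged result sets lsr_net_profile_exists, lsr_net_profile_ansible_managed and lsr_net_profile_fingerprint to true. A minimal sketch of that flag derivation, assuming (as the logged conditional and result suggest) that all three flags key off the same rc check; this is illustrative, not the role's implementation.

def derive_profile_flags(nm_profile_exists_rc):
    # rc == 0 from the nmcli pipeline means at least one bond0 profile file
    # was found under /etc, so all three flags are reported as true.
    exists = (nm_profile_exists_rc == 0)
    return {
        "lsr_net_profile_exists": exists,
        "lsr_net_profile_ansible_managed": exists,
        "lsr_net_profile_fingerprint": exists,
    }

print(derive_profile_flags(0))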
11701 1727096133.24862: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0 11701 1727096133.24869: in run() - task 0afff68d-5257-a05c-c957-0000000003b6 11701 1727096133.24883: variable 'ansible_search_path' from source: unknown 11701 1727096133.24890: variable 'ansible_search_path' from source: unknown 11701 1727096133.24957: calling self._execute() 11701 1727096133.25034: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096133.25044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096133.25065: variable 'omit' from source: magic vars 11701 1727096133.25473: variable 'ansible_distribution_major_version' from source: facts 11701 1727096133.25477: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096133.25565: variable 'profile_stat' from source: set_fact 11701 1727096133.25585: Evaluated conditional (profile_stat.stat.exists): False 11701 1727096133.25593: when evaluation is False, skipping this task 11701 1727096133.25598: _execute() done 11701 1727096133.25611: dumping result to json 11701 1727096133.25618: done dumping result, returning 11701 1727096133.25628: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0 [0afff68d-5257-a05c-c957-0000000003b6] 11701 1727096133.25636: sending task result for task 0afff68d-5257-a05c-c957-0000000003b6 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11701 1727096133.25885: no more pending results, returning what we have 11701 1727096133.25889: results queue empty 11701 1727096133.25890: checking for any_errors_fatal 11701 1727096133.25897: done checking for any_errors_fatal 11701 1727096133.25898: checking for max_fail_percentage 11701 1727096133.25900: done checking for max_fail_percentage 11701 1727096133.25901: checking to see if all hosts have failed and the running result is not ok 11701 1727096133.25902: done checking to see if all hosts have failed 11701 1727096133.25903: getting the remaining hosts for this loop 11701 1727096133.25904: done getting the remaining hosts for this loop 11701 1727096133.25908: getting the next task for host managed_node3 11701 1727096133.25916: done getting next task for host managed_node3 11701 1727096133.25919: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11701 1727096133.25923: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096133.25951: getting variables 11701 1727096133.25954: in VariableManager get_vars() 11701 1727096133.26002: Calling all_inventory to load vars for managed_node3 11701 1727096133.26005: Calling groups_inventory to load vars for managed_node3 11701 1727096133.26008: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096133.26023: Calling all_plugins_play to load vars for managed_node3 11701 1727096133.26026: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096133.26029: Calling groups_plugins_play to load vars for managed_node3 11701 1727096133.26553: done sending task result for task 0afff68d-5257-a05c-c957-0000000003b6 11701 1727096133.26556: WORKER PROCESS EXITING 11701 1727096133.28349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096133.30401: done with get_vars() 11701 1727096133.30434: done getting variables 11701 1727096133.30520: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11701 1727096133.30654: variable 'profile' from source: include params 11701 1727096133.30659: variable 'item' from source: include params 11701 1727096133.30740: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0] *********************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Monday 23 September 2024 08:55:33 -0400 (0:00:00.067) 0:00:17.272 ****** 11701 1727096133.30782: entering _queue_task() for managed_node3/set_fact 11701 1727096133.31401: worker is 1 (out of 1 available) 11701 1727096133.31410: exiting _queue_task() for managed_node3/set_fact 11701 1727096133.31420: done queuing things up, now waiting for results queue to drain 11701 1727096133.31422: waiting for pending results... 
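[Editor's note] The "Get the ansible_managed comment in ifcfg-bond0" task above is skipped because profile_stat.stat.exists evaluated to False, which is consistent with the earlier nmcli output showing keyfile profiles under /etc/NetworkManager/system-connections rather than ifcfg files. A minimal sketch of that skip pattern follows; the ifcfg path is an assumption (the conventional initscripts location), since the log only records the False conditional.

import os

profile = "bond0"
# Assumed ifcfg location; the log only records that profile_stat.stat.exists
# evaluated to False for this profile.
ifcfg_path = "/etc/sysconfig/network-scripts/ifcfg-" + profile

if not os.path.exists(ifcfg_path):
    print("skipping ifcfg checks: %s does not exist" % ifcfg_path)
else:
    print("would inspect the ansible_managed/fingerprint comments in", ifcfg_path)

The same conditional skips the three following ifcfg verification tasks in this run.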
11701 1727096133.31588: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0 11701 1727096133.31709: in run() - task 0afff68d-5257-a05c-c957-0000000003b7 11701 1727096133.31731: variable 'ansible_search_path' from source: unknown 11701 1727096133.31739: variable 'ansible_search_path' from source: unknown 11701 1727096133.31792: calling self._execute() 11701 1727096133.31890: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096133.31982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096133.31986: variable 'omit' from source: magic vars 11701 1727096133.32314: variable 'ansible_distribution_major_version' from source: facts 11701 1727096133.32334: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096133.32459: variable 'profile_stat' from source: set_fact 11701 1727096133.32481: Evaluated conditional (profile_stat.stat.exists): False 11701 1727096133.32488: when evaluation is False, skipping this task 11701 1727096133.32496: _execute() done 11701 1727096133.32503: dumping result to json 11701 1727096133.32510: done dumping result, returning 11701 1727096133.32523: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0 [0afff68d-5257-a05c-c957-0000000003b7] 11701 1727096133.32536: sending task result for task 0afff68d-5257-a05c-c957-0000000003b7 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11701 1727096133.32683: no more pending results, returning what we have 11701 1727096133.32688: results queue empty 11701 1727096133.32689: checking for any_errors_fatal 11701 1727096133.32697: done checking for any_errors_fatal 11701 1727096133.32697: checking for max_fail_percentage 11701 1727096133.32699: done checking for max_fail_percentage 11701 1727096133.32700: checking to see if all hosts have failed and the running result is not ok 11701 1727096133.32701: done checking to see if all hosts have failed 11701 1727096133.32701: getting the remaining hosts for this loop 11701 1727096133.32703: done getting the remaining hosts for this loop 11701 1727096133.32706: getting the next task for host managed_node3 11701 1727096133.32712: done getting next task for host managed_node3 11701 1727096133.32715: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11701 1727096133.32719: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096133.32724: getting variables 11701 1727096133.32726: in VariableManager get_vars() 11701 1727096133.32771: Calling all_inventory to load vars for managed_node3 11701 1727096133.32774: Calling groups_inventory to load vars for managed_node3 11701 1727096133.32777: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096133.32791: Calling all_plugins_play to load vars for managed_node3 11701 1727096133.32794: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096133.32797: Calling groups_plugins_play to load vars for managed_node3 11701 1727096133.33499: done sending task result for task 0afff68d-5257-a05c-c957-0000000003b7 11701 1727096133.33503: WORKER PROCESS EXITING 11701 1727096133.34539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096133.36238: done with get_vars() 11701 1727096133.36264: done getting variables 11701 1727096133.36323: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11701 1727096133.36452: variable 'profile' from source: include params 11701 1727096133.36456: variable 'item' from source: include params 11701 1727096133.36517: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0] ****************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Monday 23 September 2024 08:55:33 -0400 (0:00:00.057) 0:00:17.330 ****** 11701 1727096133.36548: entering _queue_task() for managed_node3/command 11701 1727096133.37095: worker is 1 (out of 1 available) 11701 1727096133.37104: exiting _queue_task() for managed_node3/command 11701 1727096133.37114: done queuing things up, now waiting for results queue to drain 11701 1727096133.37116: waiting for pending results... 
11701 1727096133.37243: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0 11701 1727096133.37370: in run() - task 0afff68d-5257-a05c-c957-0000000003b8 11701 1727096133.37391: variable 'ansible_search_path' from source: unknown 11701 1727096133.37400: variable 'ansible_search_path' from source: unknown 11701 1727096133.37453: calling self._execute() 11701 1727096133.37578: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096133.37590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096133.37670: variable 'omit' from source: magic vars 11701 1727096133.37962: variable 'ansible_distribution_major_version' from source: facts 11701 1727096133.38016: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096133.38144: variable 'profile_stat' from source: set_fact 11701 1727096133.38163: Evaluated conditional (profile_stat.stat.exists): False 11701 1727096133.38173: when evaluation is False, skipping this task 11701 1727096133.38180: _execute() done 11701 1727096133.38187: dumping result to json 11701 1727096133.38194: done dumping result, returning 11701 1727096133.38209: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0 [0afff68d-5257-a05c-c957-0000000003b8] 11701 1727096133.38222: sending task result for task 0afff68d-5257-a05c-c957-0000000003b8 11701 1727096133.38432: done sending task result for task 0afff68d-5257-a05c-c957-0000000003b8 11701 1727096133.38436: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11701 1727096133.38492: no more pending results, returning what we have 11701 1727096133.38496: results queue empty 11701 1727096133.38497: checking for any_errors_fatal 11701 1727096133.38504: done checking for any_errors_fatal 11701 1727096133.38505: checking for max_fail_percentage 11701 1727096133.38507: done checking for max_fail_percentage 11701 1727096133.38508: checking to see if all hosts have failed and the running result is not ok 11701 1727096133.38509: done checking to see if all hosts have failed 11701 1727096133.38509: getting the remaining hosts for this loop 11701 1727096133.38511: done getting the remaining hosts for this loop 11701 1727096133.38514: getting the next task for host managed_node3 11701 1727096133.38521: done getting next task for host managed_node3 11701 1727096133.38525: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11701 1727096133.38645: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096133.38651: getting variables 11701 1727096133.38653: in VariableManager get_vars() 11701 1727096133.38699: Calling all_inventory to load vars for managed_node3 11701 1727096133.38702: Calling groups_inventory to load vars for managed_node3 11701 1727096133.38705: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096133.38718: Calling all_plugins_play to load vars for managed_node3 11701 1727096133.38721: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096133.38724: Calling groups_plugins_play to load vars for managed_node3 11701 1727096133.40497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096133.44212: done with get_vars() 11701 1727096133.44246: done getting variables 11701 1727096133.44408: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11701 1727096133.44511: variable 'profile' from source: include params 11701 1727096133.44514: variable 'item' from source: include params 11701 1727096133.44769: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Monday 23 September 2024 08:55:33 -0400 (0:00:00.082) 0:00:17.412 ****** 11701 1727096133.44804: entering _queue_task() for managed_node3/set_fact 11701 1727096133.45327: worker is 1 (out of 1 available) 11701 1727096133.45340: exiting _queue_task() for managed_node3/set_fact 11701 1727096133.45352: done queuing things up, now waiting for results queue to drain 11701 1727096133.45354: waiting for pending results... 
11701 1727096133.46286: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0 11701 1727096133.46293: in run() - task 0afff68d-5257-a05c-c957-0000000003b9 11701 1727096133.46296: variable 'ansible_search_path' from source: unknown 11701 1727096133.46299: variable 'ansible_search_path' from source: unknown 11701 1727096133.46409: calling self._execute() 11701 1727096133.46636: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096133.46639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096133.46642: variable 'omit' from source: magic vars 11701 1727096133.47316: variable 'ansible_distribution_major_version' from source: facts 11701 1727096133.47330: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096133.47776: variable 'profile_stat' from source: set_fact 11701 1727096133.47779: Evaluated conditional (profile_stat.stat.exists): False 11701 1727096133.47782: when evaluation is False, skipping this task 11701 1727096133.47784: _execute() done 11701 1727096133.47786: dumping result to json 11701 1727096133.47789: done dumping result, returning 11701 1727096133.47792: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0 [0afff68d-5257-a05c-c957-0000000003b9] 11701 1727096133.47794: sending task result for task 0afff68d-5257-a05c-c957-0000000003b9 11701 1727096133.47858: done sending task result for task 0afff68d-5257-a05c-c957-0000000003b9 11701 1727096133.47861: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11701 1727096133.47927: no more pending results, returning what we have 11701 1727096133.47932: results queue empty 11701 1727096133.47933: checking for any_errors_fatal 11701 1727096133.47941: done checking for any_errors_fatal 11701 1727096133.47942: checking for max_fail_percentage 11701 1727096133.47944: done checking for max_fail_percentage 11701 1727096133.47945: checking to see if all hosts have failed and the running result is not ok 11701 1727096133.47946: done checking to see if all hosts have failed 11701 1727096133.47947: getting the remaining hosts for this loop 11701 1727096133.47948: done getting the remaining hosts for this loop 11701 1727096133.47952: getting the next task for host managed_node3 11701 1727096133.47960: done getting next task for host managed_node3 11701 1727096133.47963: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11701 1727096133.47969: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096133.47975: getting variables 11701 1727096133.47976: in VariableManager get_vars() 11701 1727096133.48017: Calling all_inventory to load vars for managed_node3 11701 1727096133.48019: Calling groups_inventory to load vars for managed_node3 11701 1727096133.48022: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096133.48034: Calling all_plugins_play to load vars for managed_node3 11701 1727096133.48036: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096133.48039: Calling groups_plugins_play to load vars for managed_node3 11701 1727096133.51276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096133.54615: done with get_vars() 11701 1727096133.54640: done getting variables 11701 1727096133.54906: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11701 1727096133.55028: variable 'profile' from source: include params 11701 1727096133.55032: variable 'item' from source: include params 11701 1727096133.55294: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0'] **************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Monday 23 September 2024 08:55:33 -0400 (0:00:00.105) 0:00:17.517 ****** 11701 1727096133.55327: entering _queue_task() for managed_node3/assert 11701 1727096133.56085: worker is 1 (out of 1 available) 11701 1727096133.56098: exiting _queue_task() for managed_node3/assert 11701 1727096133.56111: done queuing things up, now waiting for results queue to drain 11701 1727096133.56113: waiting for pending results... 
11701 1727096133.56647: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0' 11701 1727096133.56743: in run() - task 0afff68d-5257-a05c-c957-000000000260 11701 1727096133.56765: variable 'ansible_search_path' from source: unknown 11701 1727096133.56975: variable 'ansible_search_path' from source: unknown 11701 1727096133.57006: calling self._execute() 11701 1727096133.57099: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096133.57105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096133.57116: variable 'omit' from source: magic vars 11701 1727096133.57877: variable 'ansible_distribution_major_version' from source: facts 11701 1727096133.57887: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096133.57894: variable 'omit' from source: magic vars 11701 1727096133.57929: variable 'omit' from source: magic vars 11701 1727096133.58237: variable 'profile' from source: include params 11701 1727096133.58240: variable 'item' from source: include params 11701 1727096133.58311: variable 'item' from source: include params 11701 1727096133.58331: variable 'omit' from source: magic vars 11701 1727096133.58582: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096133.58619: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096133.58640: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096133.58662: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096133.58676: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096133.58709: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096133.58712: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096133.58717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096133.59052: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096133.59055: Set connection var ansible_timeout to 10 11701 1727096133.59058: Set connection var ansible_shell_type to sh 11701 1727096133.59060: Set connection var ansible_shell_executable to /bin/sh 11701 1727096133.59061: Set connection var ansible_connection to ssh 11701 1727096133.59064: Set connection var ansible_pipelining to False 11701 1727096133.59080: variable 'ansible_shell_executable' from source: unknown 11701 1727096133.59082: variable 'ansible_connection' from source: unknown 11701 1727096133.59085: variable 'ansible_module_compression' from source: unknown 11701 1727096133.59087: variable 'ansible_shell_type' from source: unknown 11701 1727096133.59089: variable 'ansible_shell_executable' from source: unknown 11701 1727096133.59162: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096133.59165: variable 'ansible_pipelining' from source: unknown 11701 1727096133.59169: variable 'ansible_timeout' from source: unknown 11701 1727096133.59171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096133.59452: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096133.59466: variable 'omit' from source: magic vars 11701 1727096133.59473: starting attempt loop 11701 1727096133.59477: running the handler 11701 1727096133.59974: variable 'lsr_net_profile_exists' from source: set_fact 11701 1727096133.59976: Evaluated conditional (lsr_net_profile_exists): True 11701 1727096133.59979: handler run complete 11701 1727096133.59981: attempt loop complete, returning result 11701 1727096133.59982: _execute() done 11701 1727096133.59984: dumping result to json 11701 1727096133.59987: done dumping result, returning 11701 1727096133.59988: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0' [0afff68d-5257-a05c-c957-000000000260] 11701 1727096133.59990: sending task result for task 0afff68d-5257-a05c-c957-000000000260 11701 1727096133.60054: done sending task result for task 0afff68d-5257-a05c-c957-000000000260 11701 1727096133.60057: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11701 1727096133.60122: no more pending results, returning what we have 11701 1727096133.60126: results queue empty 11701 1727096133.60127: checking for any_errors_fatal 11701 1727096133.60134: done checking for any_errors_fatal 11701 1727096133.60135: checking for max_fail_percentage 11701 1727096133.60137: done checking for max_fail_percentage 11701 1727096133.60138: checking to see if all hosts have failed and the running result is not ok 11701 1727096133.60139: done checking to see if all hosts have failed 11701 1727096133.60140: getting the remaining hosts for this loop 11701 1727096133.60141: done getting the remaining hosts for this loop 11701 1727096133.60145: getting the next task for host managed_node3 11701 1727096133.60151: done getting next task for host managed_node3 11701 1727096133.60153: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11701 1727096133.60157: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096133.60161: getting variables 11701 1727096133.60163: in VariableManager get_vars() 11701 1727096133.60207: Calling all_inventory to load vars for managed_node3 11701 1727096133.60210: Calling groups_inventory to load vars for managed_node3 11701 1727096133.60213: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096133.60225: Calling all_plugins_play to load vars for managed_node3 11701 1727096133.60228: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096133.60231: Calling groups_plugins_play to load vars for managed_node3 11701 1727096133.62970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096133.65494: done with get_vars() 11701 1727096133.65533: done getting variables 11701 1727096133.65599: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11701 1727096133.65721: variable 'profile' from source: include params 11701 1727096133.65725: variable 'item' from source: include params 11701 1727096133.65785: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0'] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Monday 23 September 2024 08:55:33 -0400 (0:00:00.104) 0:00:17.622 ****** 11701 1727096133.65818: entering _queue_task() for managed_node3/assert 11701 1727096133.66211: worker is 1 (out of 1 available) 11701 1727096133.66225: exiting _queue_task() for managed_node3/assert 11701 1727096133.66239: done queuing things up, now waiting for results queue to drain 11701 1727096133.66240: waiting for pending results... 
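[Editor's note] The assert tasks in this stretch each check one of the flags set earlier: "Assert that the profile is present - 'bond0'" passes on lsr_net_profile_exists, and the ansible-managed and fingerprint asserts check the other two flags. A minimal sketch of that assertion pattern, using the flag values logged above; it is illustrative only, not the playbook's assert module.

flags = {
    "lsr_net_profile_exists": True,
    "lsr_net_profile_ansible_managed": True,
    "lsr_net_profile_fingerprint": True,
}

for name, value in flags.items():
    assert value, "%s is not set for profile bond0" % name
print("All assertions passed")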
11701 1727096133.66619: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0' 11701 1727096133.66625: in run() - task 0afff68d-5257-a05c-c957-000000000261 11701 1727096133.66658: variable 'ansible_search_path' from source: unknown 11701 1727096133.66673: variable 'ansible_search_path' from source: unknown 11701 1727096133.66715: calling self._execute() 11701 1727096133.66836: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096133.66859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096133.66877: variable 'omit' from source: magic vars 11701 1727096133.67584: variable 'ansible_distribution_major_version' from source: facts 11701 1727096133.67589: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096133.67592: variable 'omit' from source: magic vars 11701 1727096133.67594: variable 'omit' from source: magic vars 11701 1727096133.67817: variable 'profile' from source: include params 11701 1727096133.67909: variable 'item' from source: include params 11701 1727096133.67953: variable 'item' from source: include params 11701 1727096133.68236: variable 'omit' from source: magic vars 11701 1727096133.68239: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096133.68242: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096133.68271: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096133.68293: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096133.68308: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096133.68383: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096133.68458: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096133.68475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096133.68589: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096133.68600: Set connection var ansible_timeout to 10 11701 1727096133.68607: Set connection var ansible_shell_type to sh 11701 1727096133.68615: Set connection var ansible_shell_executable to /bin/sh 11701 1727096133.68621: Set connection var ansible_connection to ssh 11701 1727096133.68638: Set connection var ansible_pipelining to False 11701 1727096133.68685: variable 'ansible_shell_executable' from source: unknown 11701 1727096133.68688: variable 'ansible_connection' from source: unknown 11701 1727096133.68690: variable 'ansible_module_compression' from source: unknown 11701 1727096133.68693: variable 'ansible_shell_type' from source: unknown 11701 1727096133.68695: variable 'ansible_shell_executable' from source: unknown 11701 1727096133.68697: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096133.68699: variable 'ansible_pipelining' from source: unknown 11701 1727096133.68701: variable 'ansible_timeout' from source: unknown 11701 1727096133.68703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096133.68901: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096133.68907: variable 'omit' from source: magic vars 11701 1727096133.68913: starting attempt loop 11701 1727096133.68916: running the handler 11701 1727096133.69039: variable 'lsr_net_profile_ansible_managed' from source: set_fact 11701 1727096133.69043: Evaluated conditional (lsr_net_profile_ansible_managed): True 11701 1727096133.69057: handler run complete 11701 1727096133.69073: attempt loop complete, returning result 11701 1727096133.69076: _execute() done 11701 1727096133.69079: dumping result to json 11701 1727096133.69082: done dumping result, returning 11701 1727096133.69089: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0' [0afff68d-5257-a05c-c957-000000000261] 11701 1727096133.69092: sending task result for task 0afff68d-5257-a05c-c957-000000000261 11701 1727096133.69186: done sending task result for task 0afff68d-5257-a05c-c957-000000000261 11701 1727096133.69188: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11701 1727096133.69262: no more pending results, returning what we have 11701 1727096133.69266: results queue empty 11701 1727096133.69270: checking for any_errors_fatal 11701 1727096133.69279: done checking for any_errors_fatal 11701 1727096133.69280: checking for max_fail_percentage 11701 1727096133.69282: done checking for max_fail_percentage 11701 1727096133.69283: checking to see if all hosts have failed and the running result is not ok 11701 1727096133.69284: done checking to see if all hosts have failed 11701 1727096133.69285: getting the remaining hosts for this loop 11701 1727096133.69286: done getting the remaining hosts for this loop 11701 1727096133.69290: getting the next task for host managed_node3 11701 1727096133.69297: done getting next task for host managed_node3 11701 1727096133.69300: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11701 1727096133.69303: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096133.69308: getting variables 11701 1727096133.69309: in VariableManager get_vars() 11701 1727096133.69356: Calling all_inventory to load vars for managed_node3 11701 1727096133.69359: Calling groups_inventory to load vars for managed_node3 11701 1727096133.69362: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096133.69379: Calling all_plugins_play to load vars for managed_node3 11701 1727096133.69382: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096133.69385: Calling groups_plugins_play to load vars for managed_node3 11701 1727096133.71240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096133.72959: done with get_vars() 11701 1727096133.72995: done getting variables 11701 1727096133.73064: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11701 1727096133.73288: variable 'profile' from source: include params 11701 1727096133.73314: variable 'item' from source: include params 11701 1727096133.73410: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0] ***************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Monday 23 September 2024 08:55:33 -0400 (0:00:00.076) 0:00:17.699 ****** 11701 1727096133.73451: entering _queue_task() for managed_node3/assert 11701 1727096133.73833: worker is 1 (out of 1 available) 11701 1727096133.73845: exiting _queue_task() for managed_node3/assert 11701 1727096133.73857: done queuing things up, now waiting for results queue to drain 11701 1727096133.73859: waiting for pending results... 
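The next queued task (assert_profile_present.yml:15) follows the same pattern; based on the conditional lsr_net_profile_fingerprint evaluated below, it is presumably equivalent to:

    - name: Assert that the fingerprint comment is present in {{ profile }}
      assert:
        that:
          - lsr_net_profile_fingerprint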
11701 1727096133.74198: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0 11701 1727096133.74274: in run() - task 0afff68d-5257-a05c-c957-000000000262 11701 1727096133.74279: variable 'ansible_search_path' from source: unknown 11701 1727096133.74282: variable 'ansible_search_path' from source: unknown 11701 1727096133.74322: calling self._execute() 11701 1727096133.74416: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096133.74421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096133.74432: variable 'omit' from source: magic vars 11701 1727096133.74804: variable 'ansible_distribution_major_version' from source: facts 11701 1727096133.74809: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096133.74872: variable 'omit' from source: magic vars 11701 1727096133.74875: variable 'omit' from source: magic vars 11701 1727096133.74974: variable 'profile' from source: include params 11701 1727096133.74978: variable 'item' from source: include params 11701 1727096133.75039: variable 'item' from source: include params 11701 1727096133.75065: variable 'omit' from source: magic vars 11701 1727096133.75111: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096133.75150: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096133.75177: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096133.75206: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096133.75210: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096133.75249: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096133.75255: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096133.75258: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096133.75358: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096133.75362: Set connection var ansible_timeout to 10 11701 1727096133.75365: Set connection var ansible_shell_type to sh 11701 1727096133.75372: Set connection var ansible_shell_executable to /bin/sh 11701 1727096133.75375: Set connection var ansible_connection to ssh 11701 1727096133.75389: Set connection var ansible_pipelining to False 11701 1727096133.75774: variable 'ansible_shell_executable' from source: unknown 11701 1727096133.75777: variable 'ansible_connection' from source: unknown 11701 1727096133.75779: variable 'ansible_module_compression' from source: unknown 11701 1727096133.75780: variable 'ansible_shell_type' from source: unknown 11701 1727096133.75782: variable 'ansible_shell_executable' from source: unknown 11701 1727096133.75783: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096133.75785: variable 'ansible_pipelining' from source: unknown 11701 1727096133.75787: variable 'ansible_timeout' from source: unknown 11701 1727096133.75788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096133.75790: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096133.75792: variable 'omit' from source: magic vars 11701 1727096133.75794: starting attempt loop 11701 1727096133.75795: running the handler 11701 1727096133.75797: variable 'lsr_net_profile_fingerprint' from source: set_fact 11701 1727096133.75799: Evaluated conditional (lsr_net_profile_fingerprint): True 11701 1727096133.75800: handler run complete 11701 1727096133.75802: attempt loop complete, returning result 11701 1727096133.75803: _execute() done 11701 1727096133.75805: dumping result to json 11701 1727096133.75807: done dumping result, returning 11701 1727096133.75808: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0 [0afff68d-5257-a05c-c957-000000000262] 11701 1727096133.75812: sending task result for task 0afff68d-5257-a05c-c957-000000000262 11701 1727096133.75874: done sending task result for task 0afff68d-5257-a05c-c957-000000000262 11701 1727096133.75878: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11701 1727096133.75926: no more pending results, returning what we have 11701 1727096133.75929: results queue empty 11701 1727096133.75930: checking for any_errors_fatal 11701 1727096133.75939: done checking for any_errors_fatal 11701 1727096133.75939: checking for max_fail_percentage 11701 1727096133.75941: done checking for max_fail_percentage 11701 1727096133.75942: checking to see if all hosts have failed and the running result is not ok 11701 1727096133.75943: done checking to see if all hosts have failed 11701 1727096133.75944: getting the remaining hosts for this loop 11701 1727096133.75945: done getting the remaining hosts for this loop 11701 1727096133.75948: getting the next task for host managed_node3 11701 1727096133.75958: done getting next task for host managed_node3 11701 1727096133.75961: ^ task is: TASK: Include the task 'get_profile_stat.yml' 11701 1727096133.75965: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096133.75971: getting variables 11701 1727096133.75973: in VariableManager get_vars() 11701 1727096133.76017: Calling all_inventory to load vars for managed_node3 11701 1727096133.76020: Calling groups_inventory to load vars for managed_node3 11701 1727096133.76023: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096133.76035: Calling all_plugins_play to load vars for managed_node3 11701 1727096133.76038: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096133.76041: Calling groups_plugins_play to load vars for managed_node3 11701 1727096133.78116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096133.79952: done with get_vars() 11701 1727096133.79989: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Monday 23 September 2024 08:55:33 -0400 (0:00:00.066) 0:00:17.765 ****** 11701 1727096133.80084: entering _queue_task() for managed_node3/include_tasks 11701 1727096133.80444: worker is 1 (out of 1 available) 11701 1727096133.80455: exiting _queue_task() for managed_node3/include_tasks 11701 1727096133.80467: done queuing things up, now waiting for results queue to drain 11701 1727096133.80615: waiting for pending results... 11701 1727096133.80834: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 11701 1727096133.81381: in run() - task 0afff68d-5257-a05c-c957-000000000266 11701 1727096133.81385: variable 'ansible_search_path' from source: unknown 11701 1727096133.81389: variable 'ansible_search_path' from source: unknown 11701 1727096133.81392: calling self._execute() 11701 1727096133.81394: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096133.81397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096133.81400: variable 'omit' from source: magic vars 11701 1727096133.81604: variable 'ansible_distribution_major_version' from source: facts 11701 1727096133.81615: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096133.81621: _execute() done 11701 1727096133.81624: dumping result to json 11701 1727096133.81627: done dumping result, returning 11701 1727096133.81636: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0afff68d-5257-a05c-c957-000000000266] 11701 1727096133.81641: sending task result for task 0afff68d-5257-a05c-c957-000000000266 11701 1727096133.81734: done sending task result for task 0afff68d-5257-a05c-c957-000000000266 11701 1727096133.81738: WORKER PROCESS EXITING 11701 1727096133.81771: no more pending results, returning what we have 11701 1727096133.81777: in VariableManager get_vars() 11701 1727096133.81838: Calling all_inventory to load vars for managed_node3 11701 1727096133.81841: Calling groups_inventory to load vars for managed_node3 11701 1727096133.81844: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096133.81860: Calling all_plugins_play to load vars for managed_node3 11701 1727096133.81863: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096133.81866: Calling groups_plugins_play to load vars for managed_node3 11701 1727096133.83726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 11701 1727096133.85311: done with get_vars() 11701 1727096133.85338: variable 'ansible_search_path' from source: unknown 11701 1727096133.85339: variable 'ansible_search_path' from source: unknown 11701 1727096133.85382: we have included files to process 11701 1727096133.85387: generating all_blocks data 11701 1727096133.85390: done generating all_blocks data 11701 1727096133.85394: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11701 1727096133.85395: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11701 1727096133.85398: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11701 1727096133.86342: done processing included file 11701 1727096133.86344: iterating over new_blocks loaded from include file 11701 1727096133.86346: in VariableManager get_vars() 11701 1727096133.86372: done with get_vars() 11701 1727096133.86375: filtering new block on tags 11701 1727096133.86400: done filtering new block on tags 11701 1727096133.86402: in VariableManager get_vars() 11701 1727096133.86420: done with get_vars() 11701 1727096133.86421: filtering new block on tags 11701 1727096133.86442: done filtering new block on tags 11701 1727096133.86444: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 11701 1727096133.86449: extending task lists for all hosts with included blocks 11701 1727096133.86630: done extending task lists 11701 1727096133.86631: done processing included files 11701 1727096133.86632: results queue empty 11701 1727096133.86633: checking for any_errors_fatal 11701 1727096133.86636: done checking for any_errors_fatal 11701 1727096133.86637: checking for max_fail_percentage 11701 1727096133.86638: done checking for max_fail_percentage 11701 1727096133.86638: checking to see if all hosts have failed and the running result is not ok 11701 1727096133.86639: done checking to see if all hosts have failed 11701 1727096133.86640: getting the remaining hosts for this loop 11701 1727096133.86641: done getting the remaining hosts for this loop 11701 1727096133.86644: getting the next task for host managed_node3 11701 1727096133.86648: done getting next task for host managed_node3 11701 1727096133.86650: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 11701 1727096133.86653: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096133.86655: getting variables 11701 1727096133.86656: in VariableManager get_vars() 11701 1727096133.86671: Calling all_inventory to load vars for managed_node3 11701 1727096133.86674: Calling groups_inventory to load vars for managed_node3 11701 1727096133.86676: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096133.86681: Calling all_plugins_play to load vars for managed_node3 11701 1727096133.86683: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096133.86686: Calling groups_plugins_play to load vars for managed_node3 11701 1727096133.87874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096133.89511: done with get_vars() 11701 1727096133.89539: done getting variables 11701 1727096133.89588: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Monday 23 September 2024 08:55:33 -0400 (0:00:00.095) 0:00:17.860 ****** 11701 1727096133.89622: entering _queue_task() for managed_node3/set_fact 11701 1727096133.89989: worker is 1 (out of 1 available) 11701 1727096133.90003: exiting _queue_task() for managed_node3/set_fact 11701 1727096133.90015: done queuing things up, now waiting for results queue to drain 11701 1727096133.90017: waiting for pending results... 
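The set_fact task queued here (get_profile_stat.yml:3) initializes the three flags that the assertions above consumed. The result printed below lists exactly which facts it sets, so the task is presumably equivalent to this sketch:

    - name: Initialize NM profile exist and ansible_managed comment flag
      set_fact:
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false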
11701 1727096133.90385: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 11701 1727096133.90401: in run() - task 0afff68d-5257-a05c-c957-0000000003f8 11701 1727096133.90424: variable 'ansible_search_path' from source: unknown 11701 1727096133.90430: variable 'ansible_search_path' from source: unknown 11701 1727096133.90468: calling self._execute() 11701 1727096133.90566: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096133.90579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096133.90592: variable 'omit' from source: magic vars 11701 1727096133.90949: variable 'ansible_distribution_major_version' from source: facts 11701 1727096133.90955: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096133.90961: variable 'omit' from source: magic vars 11701 1727096133.91005: variable 'omit' from source: magic vars 11701 1727096133.91060: variable 'omit' from source: magic vars 11701 1727096133.91097: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096133.91170: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096133.91174: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096133.91194: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096133.91210: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096133.91244: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096133.91256: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096133.91372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096133.91384: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096133.91397: Set connection var ansible_timeout to 10 11701 1727096133.91404: Set connection var ansible_shell_type to sh 11701 1727096133.91416: Set connection var ansible_shell_executable to /bin/sh 11701 1727096133.91423: Set connection var ansible_connection to ssh 11701 1727096133.91439: Set connection var ansible_pipelining to False 11701 1727096133.91466: variable 'ansible_shell_executable' from source: unknown 11701 1727096133.91488: variable 'ansible_connection' from source: unknown 11701 1727096133.91499: variable 'ansible_module_compression' from source: unknown 11701 1727096133.91507: variable 'ansible_shell_type' from source: unknown 11701 1727096133.91604: variable 'ansible_shell_executable' from source: unknown 11701 1727096133.91608: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096133.91610: variable 'ansible_pipelining' from source: unknown 11701 1727096133.91613: variable 'ansible_timeout' from source: unknown 11701 1727096133.91615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096133.91721: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096133.91737: variable 
'omit' from source: magic vars 11701 1727096133.91761: starting attempt loop 11701 1727096133.91771: running the handler 11701 1727096133.91795: handler run complete 11701 1727096133.91821: attempt loop complete, returning result 11701 1727096133.91832: _execute() done 11701 1727096133.91839: dumping result to json 11701 1727096133.91846: done dumping result, returning 11701 1727096133.91880: done running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0afff68d-5257-a05c-c957-0000000003f8] 11701 1727096133.91883: sending task result for task 0afff68d-5257-a05c-c957-0000000003f8 11701 1727096133.92087: done sending task result for task 0afff68d-5257-a05c-c957-0000000003f8 11701 1727096133.92091: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 11701 1727096133.92303: no more pending results, returning what we have 11701 1727096133.92305: results queue empty 11701 1727096133.92306: checking for any_errors_fatal 11701 1727096133.92308: done checking for any_errors_fatal 11701 1727096133.92309: checking for max_fail_percentage 11701 1727096133.92310: done checking for max_fail_percentage 11701 1727096133.92311: checking to see if all hosts have failed and the running result is not ok 11701 1727096133.92312: done checking to see if all hosts have failed 11701 1727096133.92312: getting the remaining hosts for this loop 11701 1727096133.92314: done getting the remaining hosts for this loop 11701 1727096133.92316: getting the next task for host managed_node3 11701 1727096133.92321: done getting next task for host managed_node3 11701 1727096133.92324: ^ task is: TASK: Stat profile file 11701 1727096133.92327: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096133.92330: getting variables 11701 1727096133.92331: in VariableManager get_vars() 11701 1727096133.92366: Calling all_inventory to load vars for managed_node3 11701 1727096133.92374: Calling groups_inventory to load vars for managed_node3 11701 1727096133.92377: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096133.92386: Calling all_plugins_play to load vars for managed_node3 11701 1727096133.92389: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096133.92392: Calling groups_plugins_play to load vars for managed_node3 11701 1727096133.93698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096133.95252: done with get_vars() 11701 1727096133.95279: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Monday 23 September 2024 08:55:33 -0400 (0:00:00.057) 0:00:17.918 ****** 11701 1727096133.95384: entering _queue_task() for managed_node3/stat 11701 1727096133.95737: worker is 1 (out of 1 available) 11701 1727096133.95750: exiting _queue_task() for managed_node3/stat 11701 1727096133.95762: done queuing things up, now waiting for results queue to drain 11701 1727096133.95764: waiting for pending results... 11701 1727096133.96126: running TaskExecutor() for managed_node3/TASK: Stat profile file 11701 1727096133.96161: in run() - task 0afff68d-5257-a05c-c957-0000000003f9 11701 1727096133.96180: variable 'ansible_search_path' from source: unknown 11701 1727096133.96184: variable 'ansible_search_path' from source: unknown 11701 1727096133.96225: calling self._execute() 11701 1727096133.96375: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096133.96378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096133.96381: variable 'omit' from source: magic vars 11701 1727096133.96703: variable 'ansible_distribution_major_version' from source: facts 11701 1727096133.96714: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096133.96720: variable 'omit' from source: magic vars 11701 1727096133.96777: variable 'omit' from source: magic vars 11701 1727096133.96880: variable 'profile' from source: include params 11701 1727096133.96883: variable 'item' from source: include params 11701 1727096133.96974: variable 'item' from source: include params 11701 1727096133.96978: variable 'omit' from source: magic vars 11701 1727096133.97009: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096133.97046: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096133.97082: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096133.97090: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096133.97103: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096133.97141: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096133.97144: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096133.97146: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096133.97306: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096133.97309: Set connection var ansible_timeout to 10 11701 1727096133.97311: Set connection var ansible_shell_type to sh 11701 1727096133.97314: Set connection var ansible_shell_executable to /bin/sh 11701 1727096133.97317: Set connection var ansible_connection to ssh 11701 1727096133.97319: Set connection var ansible_pipelining to False 11701 1727096133.97321: variable 'ansible_shell_executable' from source: unknown 11701 1727096133.97323: variable 'ansible_connection' from source: unknown 11701 1727096133.97326: variable 'ansible_module_compression' from source: unknown 11701 1727096133.97327: variable 'ansible_shell_type' from source: unknown 11701 1727096133.97329: variable 'ansible_shell_executable' from source: unknown 11701 1727096133.97331: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096133.97333: variable 'ansible_pipelining' from source: unknown 11701 1727096133.97335: variable 'ansible_timeout' from source: unknown 11701 1727096133.97337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096133.97536: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11701 1727096133.97573: variable 'omit' from source: magic vars 11701 1727096133.97577: starting attempt loop 11701 1727096133.97579: running the handler 11701 1727096133.97581: _low_level_execute_command(): starting 11701 1727096133.97584: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096133.98385: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096133.98416: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096133.98430: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096133.98473: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096133.98524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096134.00200: stdout chunk (state=3): >>>/root <<< 11701 1727096134.00362: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096134.00440: stdout chunk (state=3): >>><<< 11701 1727096134.00443: stderr chunk (state=3): >>><<< 11701 1727096134.00464: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096134.00573: _low_level_execute_command(): starting 11701 1727096134.00579: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096134.0048308-12523-243344516702082 `" && echo ansible-tmp-1727096134.0048308-12523-243344516702082="` echo /root/.ansible/tmp/ansible-tmp-1727096134.0048308-12523-243344516702082 `" ) && sleep 0' 11701 1727096134.01524: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096134.01535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 11701 1727096134.01538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11701 1727096134.01549: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 11701 1727096134.01552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096134.01626: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096134.01713: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096134.01811: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096134.03797: stdout chunk (state=3): >>>ansible-tmp-1727096134.0048308-12523-243344516702082=/root/.ansible/tmp/ansible-tmp-1727096134.0048308-12523-243344516702082 <<< 11701 1727096134.04216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096134.04221: stdout 
chunk (state=3): >>><<< 11701 1727096134.04223: stderr chunk (state=3): >>><<< 11701 1727096134.04226: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096134.0048308-12523-243344516702082=/root/.ansible/tmp/ansible-tmp-1727096134.0048308-12523-243344516702082 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096134.04275: variable 'ansible_module_compression' from source: unknown 11701 1727096134.04513: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11701ub3f79xu/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11701 1727096134.04516: variable 'ansible_facts' from source: unknown 11701 1727096134.04875: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096134.0048308-12523-243344516702082/AnsiballZ_stat.py 11701 1727096134.05004: Sending initial data 11701 1727096134.05015: Sent initial data (153 bytes) 11701 1727096134.05909: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096134.05925: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096134.05939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096134.05961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096134.05983: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096134.05995: stderr chunk (state=3): >>>debug2: match not found <<< 11701 1727096134.06076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096134.06104: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096134.06127: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096134.06142: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096134.06206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096134.08013: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096134.08118: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11701 1727096134.08165: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpjp8l_kct /root/.ansible/tmp/ansible-tmp-1727096134.0048308-12523-243344516702082/AnsiballZ_stat.py <<< 11701 1727096134.08175: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096134.0048308-12523-243344516702082/AnsiballZ_stat.py" <<< 11701 1727096134.08199: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpjp8l_kct" to remote "/root/.ansible/tmp/ansible-tmp-1727096134.0048308-12523-243344516702082/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096134.0048308-12523-243344516702082/AnsiballZ_stat.py" <<< 11701 1727096134.09479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096134.09483: stderr chunk (state=3): >>><<< 11701 1727096134.09486: stdout chunk (state=3): >>><<< 11701 1727096134.09488: done transferring module to remote 11701 1727096134.09490: _low_level_execute_command(): starting 11701 1727096134.09492: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096134.0048308-12523-243344516702082/ /root/.ansible/tmp/ansible-tmp-1727096134.0048308-12523-243344516702082/AnsiballZ_stat.py && sleep 0' 11701 1727096134.10842: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096134.10851: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096134.10853: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096134.10882: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096134.10955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096134.12854: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096134.12872: stdout chunk (state=3): >>><<< 11701 1727096134.12892: stderr chunk (state=3): >>><<< 11701 1727096134.12914: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096134.12922: _low_level_execute_command(): starting 11701 1727096134.13000: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096134.0048308-12523-243344516702082/AnsiballZ_stat.py && sleep 0' 11701 1727096134.13523: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096134.13538: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096134.13552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096134.13575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096134.13677: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096134.13699: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096134.13844: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 11701 1727096134.29595: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11701 1727096134.31381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 11701 1727096134.31385: stdout chunk (state=3): >>><<< 11701 1727096134.31388: stderr chunk (state=3): >>><<< 11701 1727096134.31390: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
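The module_args echoed in the result above show how the 'Stat profile file' task (get_profile_stat.yml:9) invoked the stat module. A sketch of the equivalent task follows; the path is written out literally as it appears in module_args (in the actual test file it is presumably templated from {{ profile }}, which resolves to bond0.0 for this iteration), and the register name is inferred from the profile_stat.stat.exists conditional evaluated later in the trace:

    - name: Stat profile file
      stat:
        path: /etc/sysconfig/network-scripts/ifcfg-bond0.0  # literal value from module_args above
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: profile_stat  # inferred from the profile_stat.stat.exists check below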
11701 1727096134.31392: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096134.0048308-12523-243344516702082/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096134.31395: _low_level_execute_command(): starting 11701 1727096134.31397: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096134.0048308-12523-243344516702082/ > /dev/null 2>&1 && sleep 0' 11701 1727096134.32003: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096134.32020: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096134.32098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096134.32102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096134.32105: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 11701 1727096134.32107: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096134.32108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096134.32179: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096134.32200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096134.32262: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096134.34135: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096134.34140: stderr chunk (state=3): >>><<< 11701 1727096134.34142: stdout chunk (state=3): >>><<< 11701 1727096134.34160: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096134.34166: handler run complete 11701 1727096134.34184: attempt loop complete, returning result 11701 1727096134.34187: _execute() done 11701 1727096134.34189: dumping result to json 11701 1727096134.34192: done dumping result, returning 11701 1727096134.34200: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0afff68d-5257-a05c-c957-0000000003f9] 11701 1727096134.34202: sending task result for task 0afff68d-5257-a05c-c957-0000000003f9 11701 1727096134.34316: done sending task result for task 0afff68d-5257-a05c-c957-0000000003f9 11701 1727096134.34319: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 11701 1727096134.34386: no more pending results, returning what we have 11701 1727096134.34389: results queue empty 11701 1727096134.34390: checking for any_errors_fatal 11701 1727096134.34398: done checking for any_errors_fatal 11701 1727096134.34399: checking for max_fail_percentage 11701 1727096134.34401: done checking for max_fail_percentage 11701 1727096134.34401: checking to see if all hosts have failed and the running result is not ok 11701 1727096134.34402: done checking to see if all hosts have failed 11701 1727096134.34403: getting the remaining hosts for this loop 11701 1727096134.34404: done getting the remaining hosts for this loop 11701 1727096134.34407: getting the next task for host managed_node3 11701 1727096134.34414: done getting next task for host managed_node3 11701 1727096134.34417: ^ task is: TASK: Set NM profile exist flag based on the profile files 11701 1727096134.34421: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096134.34424: getting variables 11701 1727096134.34426: in VariableManager get_vars() 11701 1727096134.34469: Calling all_inventory to load vars for managed_node3 11701 1727096134.34472: Calling groups_inventory to load vars for managed_node3 11701 1727096134.34475: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096134.34486: Calling all_plugins_play to load vars for managed_node3 11701 1727096134.34489: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096134.34491: Calling groups_plugins_play to load vars for managed_node3 11701 1727096134.36029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096134.40818: done with get_vars() 11701 1727096134.40837: done getting variables 11701 1727096134.40875: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Monday 23 September 2024 08:55:34 -0400 (0:00:00.455) 0:00:18.373 ****** 11701 1727096134.40897: entering _queue_task() for managed_node3/set_fact 11701 1727096134.41156: worker is 1 (out of 1 available) 11701 1727096134.41174: exiting _queue_task() for managed_node3/set_fact 11701 1727096134.41185: done queuing things up, now waiting for results queue to drain 11701 1727096134.41187: waiting for pending results... 
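For reference, the "Stat profile file" result above (exists: false) comes from the stat module invocation whose arguments are dumped at the top of this excerpt. A minimal sketch of that task follows, assuming the register name profile_stat (inferred from the profile_stat.stat.exists conditional evaluated in the next tasks) and that the source task templates the path from the profile under test rather than hard-coding it:

    - name: Stat profile file
      ansible.builtin.stat:
        # literal value taken from the module args in the log; the source task
        # presumably builds it from the profile name under test (bond0.0)
        path: /etc/sysconfig/network-scripts/ifcfg-bond0.0
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: profile_stat  # assumed name, based on the later profile_stat.stat.exists check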
11701 1727096134.41366: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 11701 1727096134.41448: in run() - task 0afff68d-5257-a05c-c957-0000000003fa 11701 1727096134.41466: variable 'ansible_search_path' from source: unknown 11701 1727096134.41471: variable 'ansible_search_path' from source: unknown 11701 1727096134.41499: calling self._execute() 11701 1727096134.41579: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096134.41584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096134.41594: variable 'omit' from source: magic vars 11701 1727096134.41892: variable 'ansible_distribution_major_version' from source: facts 11701 1727096134.41903: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096134.41988: variable 'profile_stat' from source: set_fact 11701 1727096134.41999: Evaluated conditional (profile_stat.stat.exists): False 11701 1727096134.42002: when evaluation is False, skipping this task 11701 1727096134.42004: _execute() done 11701 1727096134.42007: dumping result to json 11701 1727096134.42009: done dumping result, returning 11701 1727096134.42016: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0afff68d-5257-a05c-c957-0000000003fa] 11701 1727096134.42018: sending task result for task 0afff68d-5257-a05c-c957-0000000003fa 11701 1727096134.42097: done sending task result for task 0afff68d-5257-a05c-c957-0000000003fa 11701 1727096134.42100: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11701 1727096134.42143: no more pending results, returning what we have 11701 1727096134.42147: results queue empty 11701 1727096134.42148: checking for any_errors_fatal 11701 1727096134.42158: done checking for any_errors_fatal 11701 1727096134.42159: checking for max_fail_percentage 11701 1727096134.42161: done checking for max_fail_percentage 11701 1727096134.42161: checking to see if all hosts have failed and the running result is not ok 11701 1727096134.42162: done checking to see if all hosts have failed 11701 1727096134.42163: getting the remaining hosts for this loop 11701 1727096134.42164: done getting the remaining hosts for this loop 11701 1727096134.42169: getting the next task for host managed_node3 11701 1727096134.42175: done getting next task for host managed_node3 11701 1727096134.42178: ^ task is: TASK: Get NM profile info 11701 1727096134.42183: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096134.42191: getting variables 11701 1727096134.42192: in VariableManager get_vars() 11701 1727096134.42234: Calling all_inventory to load vars for managed_node3 11701 1727096134.42236: Calling groups_inventory to load vars for managed_node3 11701 1727096134.42238: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096134.42250: Calling all_plugins_play to load vars for managed_node3 11701 1727096134.42253: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096134.42255: Calling groups_plugins_play to load vars for managed_node3 11701 1727096134.43042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096134.43907: done with get_vars() 11701 1727096134.43925: done getting variables 11701 1727096134.43971: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Monday 23 September 2024 08:55:34 -0400 (0:00:00.030) 0:00:18.404 ****** 11701 1727096134.43996: entering _queue_task() for managed_node3/shell 11701 1727096134.44245: worker is 1 (out of 1 available) 11701 1727096134.44262: exiting _queue_task() for managed_node3/shell 11701 1727096134.44276: done queuing things up, now waiting for results queue to drain 11701 1727096134.44278: waiting for pending results... 11701 1727096134.44454: running TaskExecutor() for managed_node3/TASK: Get NM profile info 11701 1727096134.44539: in run() - task 0afff68d-5257-a05c-c957-0000000003fb 11701 1727096134.44560: variable 'ansible_search_path' from source: unknown 11701 1727096134.44564: variable 'ansible_search_path' from source: unknown 11701 1727096134.44593: calling self._execute() 11701 1727096134.44668: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096134.44673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096134.44682: variable 'omit' from source: magic vars 11701 1727096134.44953: variable 'ansible_distribution_major_version' from source: facts 11701 1727096134.44962: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096134.44969: variable 'omit' from source: magic vars 11701 1727096134.45000: variable 'omit' from source: magic vars 11701 1727096134.45072: variable 'profile' from source: include params 11701 1727096134.45075: variable 'item' from source: include params 11701 1727096134.45119: variable 'item' from source: include params 11701 1727096134.45133: variable 'omit' from source: magic vars 11701 1727096134.45171: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096134.45197: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096134.45214: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096134.45228: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096134.45238: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096134.45264: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096134.45269: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096134.45271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096134.45357: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096134.45360: Set connection var ansible_timeout to 10 11701 1727096134.45363: Set connection var ansible_shell_type to sh 11701 1727096134.45365: Set connection var ansible_shell_executable to /bin/sh 11701 1727096134.45371: Set connection var ansible_connection to ssh 11701 1727096134.45379: Set connection var ansible_pipelining to False 11701 1727096134.45399: variable 'ansible_shell_executable' from source: unknown 11701 1727096134.45402: variable 'ansible_connection' from source: unknown 11701 1727096134.45404: variable 'ansible_module_compression' from source: unknown 11701 1727096134.45406: variable 'ansible_shell_type' from source: unknown 11701 1727096134.45409: variable 'ansible_shell_executable' from source: unknown 11701 1727096134.45411: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096134.45413: variable 'ansible_pipelining' from source: unknown 11701 1727096134.45416: variable 'ansible_timeout' from source: unknown 11701 1727096134.45421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096134.45523: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096134.45531: variable 'omit' from source: magic vars 11701 1727096134.45536: starting attempt loop 11701 1727096134.45539: running the handler 11701 1727096134.45547: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096134.45564: _low_level_execute_command(): starting 11701 1727096134.45572: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096134.46066: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096134.46096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096134.46100: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096134.46103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 
10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096134.46157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096134.46160: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096134.46162: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096134.46212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096134.47882: stdout chunk (state=3): >>>/root <<< 11701 1727096134.47977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096134.48011: stderr chunk (state=3): >>><<< 11701 1727096134.48015: stdout chunk (state=3): >>><<< 11701 1727096134.48036: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096134.48049: _low_level_execute_command(): starting 11701 1727096134.48057: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096134.4803758-12546-82951637840806 `" && echo ansible-tmp-1727096134.4803758-12546-82951637840806="` echo /root/.ansible/tmp/ansible-tmp-1727096134.4803758-12546-82951637840806 `" ) && sleep 0' 11701 1727096134.48517: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096134.48520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 11701 1727096134.48525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096134.48527: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096134.48530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096134.48585: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096134.48592: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096134.48595: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096134.48628: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096134.50580: stdout chunk (state=3): >>>ansible-tmp-1727096134.4803758-12546-82951637840806=/root/.ansible/tmp/ansible-tmp-1727096134.4803758-12546-82951637840806 <<< 11701 1727096134.50687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096134.50743: stderr chunk (state=3): >>><<< 11701 1727096134.50763: stdout chunk (state=3): >>><<< 11701 1727096134.50973: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096134.4803758-12546-82951637840806=/root/.ansible/tmp/ansible-tmp-1727096134.4803758-12546-82951637840806 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096134.50978: variable 'ansible_module_compression' from source: unknown 11701 1727096134.50981: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11701ub3f79xu/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11701 1727096134.50983: variable 'ansible_facts' from source: unknown 11701 1727096134.51030: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096134.4803758-12546-82951637840806/AnsiballZ_command.py 11701 1727096134.51237: Sending initial data 11701 1727096134.51241: Sent initial data (155 bytes) 11701 1727096134.51848: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096134.51882: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096134.51995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096134.52020: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096134.52096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096134.53688: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11701 1727096134.53716: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096134.53780: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11701 1727096134.53850: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpch6s9wp1 /root/.ansible/tmp/ansible-tmp-1727096134.4803758-12546-82951637840806/AnsiballZ_command.py <<< 11701 1727096134.53853: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096134.4803758-12546-82951637840806/AnsiballZ_command.py" <<< 11701 1727096134.53893: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpch6s9wp1" to remote "/root/.ansible/tmp/ansible-tmp-1727096134.4803758-12546-82951637840806/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096134.4803758-12546-82951637840806/AnsiballZ_command.py" <<< 11701 1727096134.54402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096134.54443: stderr chunk (state=3): >>><<< 11701 1727096134.54447: stdout chunk (state=3): >>><<< 11701 1727096134.54494: done transferring module to remote 11701 1727096134.54503: _low_level_execute_command(): starting 11701 1727096134.54509: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096134.4803758-12546-82951637840806/ /root/.ansible/tmp/ansible-tmp-1727096134.4803758-12546-82951637840806/AnsiballZ_command.py && sleep 0' 11701 1727096134.54928: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096134.54963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: 
match not found <<< 11701 1727096134.54974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096134.54977: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096134.54979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 11701 1727096134.54981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096134.55025: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096134.55036: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096134.55042: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096134.55070: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096134.56861: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096134.56890: stderr chunk (state=3): >>><<< 11701 1727096134.56893: stdout chunk (state=3): >>><<< 11701 1727096134.56908: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096134.56911: _low_level_execute_command(): starting 11701 1727096134.56916: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096134.4803758-12546-82951637840806/AnsiballZ_command.py && sleep 0' 11701 1727096134.57363: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096134.57366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 11701 1727096134.57370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 11701 1727096134.57373: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096134.57375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096134.57427: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096134.57434: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096134.57472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096134.74995: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-23 08:55:34.724831", "end": "2024-09-23 08:55:34.745678", "delta": "0:00:00.020847", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11701 1727096134.76487: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096134.76501: stderr chunk (state=3): >>>Shared connection to 10.31.14.152 closed. 
<<< 11701 1727096134.76560: stderr chunk (state=3): >>><<< 11701 1727096134.76580: stdout chunk (state=3): >>><<< 11701 1727096134.76605: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-23 08:55:34.724831", "end": "2024-09-23 08:55:34.745678", "delta": "0:00:00.020847", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
11701 1727096134.76655: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096134.4803758-12546-82951637840806/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096134.76678: _low_level_execute_command(): starting 11701 1727096134.76687: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096134.4803758-12546-82951637840806/ > /dev/null 2>&1 && sleep 0' 11701 1727096134.77322: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096134.77338: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096134.77352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096134.77383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096134.77401: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11701 1727096134.77481: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096134.77508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096134.77574: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096134.79486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096134.79514: stdout chunk (state=3): >>><<< 11701 1727096134.79518: stderr chunk (state=3): >>><<< 11701 1727096134.79535: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096134.79577: handler run complete 11701 1727096134.79581: Evaluated conditional (False): False 11701 1727096134.79610: attempt loop complete, returning result 11701 1727096134.79619: _execute() done 11701 1727096134.79626: dumping result to json 11701 1727096134.79636: done dumping result, returning 11701 1727096134.79674: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0afff68d-5257-a05c-c957-0000000003fb] 11701 1727096134.79677: sending task result for task 0afff68d-5257-a05c-c957-0000000003fb 11701 1727096134.79990: done sending task result for task 0afff68d-5257-a05c-c957-0000000003fb 11701 1727096134.79993: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "delta": "0:00:00.020847", "end": "2024-09-23 08:55:34.745678", "rc": 0, "start": "2024-09-23 08:55:34.724831" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection 11701 1727096134.80063: no more pending results, returning what we have 11701 1727096134.80066: results queue empty 11701 1727096134.80069: checking for any_errors_fatal 11701 1727096134.80072: done checking for any_errors_fatal 11701 1727096134.80073: checking for max_fail_percentage 11701 1727096134.80075: done checking for max_fail_percentage 11701 1727096134.80075: checking to see if all hosts have failed and the running result is not ok 11701 1727096134.80076: done checking to see if all hosts have failed 11701 1727096134.80077: getting the remaining hosts for this loop 11701 1727096134.80078: done getting the remaining hosts for this loop 11701 1727096134.80081: getting the next task for host managed_node3 11701 1727096134.80087: done getting next task for host managed_node3 11701 1727096134.80090: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11701 1727096134.80094: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096134.80097: getting variables 11701 1727096134.80098: in VariableManager get_vars() 11701 1727096134.80140: Calling all_inventory to load vars for managed_node3 11701 1727096134.80142: Calling groups_inventory to load vars for managed_node3 11701 1727096134.80145: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096134.80158: Calling all_plugins_play to load vars for managed_node3 11701 1727096134.80160: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096134.80163: Calling groups_plugins_play to load vars for managed_node3 11701 1727096134.81872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096134.83494: done with get_vars() 11701 1727096134.83519: done getting variables 11701 1727096134.83592: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Monday 23 September 2024 08:55:34 -0400 (0:00:00.396) 0:00:18.800 ****** 11701 1727096134.83625: entering _queue_task() for managed_node3/set_fact 11701 1727096134.84156: worker is 1 (out of 1 available) 11701 1727096134.84171: exiting _queue_task() for managed_node3/set_fact 11701 1727096134.84295: done queuing things up, now waiting for results queue to drain 11701 1727096134.84297: waiting for pending results... 
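The "Get NM profile info" task above ran a shell pipeline and returned rc=0 with the bond0.0 NetworkManager keyfile path on stdout. A minimal sketch of that task, assuming the register name nm_profile_exists (inferred from the nm_profile_exists.rc == 0 conditional evaluated by the following task) and omitting whatever error handling the real test applies when grep finds no match:

    - name: Get NM profile info
      # command string copied from the logged result; the source task presumably
      # substitutes the profile name (bond0.0) from a variable
      ansible.builtin.shell: nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc
      register: nm_profile_exists  # assumed name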
11701 1727096134.84477: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11701 1727096134.84619: in run() - task 0afff68d-5257-a05c-c957-0000000003fc 11701 1727096134.84729: variable 'ansible_search_path' from source: unknown 11701 1727096134.84735: variable 'ansible_search_path' from source: unknown 11701 1727096134.84738: calling self._execute() 11701 1727096134.84802: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096134.84817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096134.84840: variable 'omit' from source: magic vars 11701 1727096134.85230: variable 'ansible_distribution_major_version' from source: facts 11701 1727096134.85246: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096134.85397: variable 'nm_profile_exists' from source: set_fact 11701 1727096134.85417: Evaluated conditional (nm_profile_exists.rc == 0): True 11701 1727096134.85429: variable 'omit' from source: magic vars 11701 1727096134.85484: variable 'omit' from source: magic vars 11701 1727096134.85526: variable 'omit' from source: magic vars 11701 1727096134.85574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096134.85673: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096134.85676: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096134.85678: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096134.85680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096134.85718: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096134.85726: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096134.85732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096134.85843: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096134.85856: Set connection var ansible_timeout to 10 11701 1727096134.85863: Set connection var ansible_shell_type to sh 11701 1727096134.85874: Set connection var ansible_shell_executable to /bin/sh 11701 1727096134.85880: Set connection var ansible_connection to ssh 11701 1727096134.85893: Set connection var ansible_pipelining to False 11701 1727096134.85918: variable 'ansible_shell_executable' from source: unknown 11701 1727096134.85975: variable 'ansible_connection' from source: unknown 11701 1727096134.85978: variable 'ansible_module_compression' from source: unknown 11701 1727096134.85980: variable 'ansible_shell_type' from source: unknown 11701 1727096134.85983: variable 'ansible_shell_executable' from source: unknown 11701 1727096134.85985: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096134.85986: variable 'ansible_pipelining' from source: unknown 11701 1727096134.85989: variable 'ansible_timeout' from source: unknown 11701 1727096134.85991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096134.86116: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096134.86134: variable 'omit' from source: magic vars 11701 1727096134.86147: starting attempt loop 11701 1727096134.86157: running the handler 11701 1727096134.86173: handler run complete 11701 1727096134.86256: attempt loop complete, returning result 11701 1727096134.86259: _execute() done 11701 1727096134.86262: dumping result to json 11701 1727096134.86264: done dumping result, returning 11701 1727096134.86266: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0afff68d-5257-a05c-c957-0000000003fc] 11701 1727096134.86272: sending task result for task 0afff68d-5257-a05c-c957-0000000003fc ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11701 1727096134.86415: no more pending results, returning what we have 11701 1727096134.86418: results queue empty 11701 1727096134.86419: checking for any_errors_fatal 11701 1727096134.86426: done checking for any_errors_fatal 11701 1727096134.86426: checking for max_fail_percentage 11701 1727096134.86428: done checking for max_fail_percentage 11701 1727096134.86429: checking to see if all hosts have failed and the running result is not ok 11701 1727096134.86430: done checking to see if all hosts have failed 11701 1727096134.86431: getting the remaining hosts for this loop 11701 1727096134.86432: done getting the remaining hosts for this loop 11701 1727096134.86436: getting the next task for host managed_node3 11701 1727096134.86445: done getting next task for host managed_node3 11701 1727096134.86448: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11701 1727096134.86455: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096134.86460: getting variables 11701 1727096134.86462: in VariableManager get_vars() 11701 1727096134.86614: Calling all_inventory to load vars for managed_node3 11701 1727096134.86616: Calling groups_inventory to load vars for managed_node3 11701 1727096134.86619: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096134.86625: done sending task result for task 0afff68d-5257-a05c-c957-0000000003fc 11701 1727096134.86629: WORKER PROCESS EXITING 11701 1727096134.86639: Calling all_plugins_play to load vars for managed_node3 11701 1727096134.86642: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096134.86645: Calling groups_plugins_play to load vars for managed_node3 11701 1727096134.88163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096134.89784: done with get_vars() 11701 1727096134.89814: done getting variables 11701 1727096134.89879: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11701 1727096134.90018: variable 'profile' from source: include params 11701 1727096134.90022: variable 'item' from source: include params 11701 1727096134.90083: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.0] ************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Monday 23 September 2024 08:55:34 -0400 (0:00:00.064) 0:00:18.865 ****** 11701 1727096134.90125: entering _queue_task() for managed_node3/command 11701 1727096134.90583: worker is 1 (out of 1 available) 11701 1727096134.90594: exiting _queue_task() for managed_node3/command 11701 1727096134.90604: done queuing things up, now waiting for results queue to drain 11701 1727096134.90605: waiting for pending results... 
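The set_fact result above records the three lsr_net_profile_* flags. Based only on the facts and the conditional shown in the log (nm_profile_exists.rc == 0), the task is roughly:

    - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
      ansible.builtin.set_fact:
        lsr_net_profile_exists: true
        lsr_net_profile_ansible_managed: true
        lsr_net_profile_fingerprint: true
      when: nm_profile_exists.rc == 0

This is a sketch reconstructed from the logged values; the actual task in get_profile_stat.yml may derive the flags from expressions rather than literals.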
11701 1727096134.90778: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.0 11701 1727096134.90910: in run() - task 0afff68d-5257-a05c-c957-0000000003fe 11701 1727096134.90929: variable 'ansible_search_path' from source: unknown 11701 1727096134.90940: variable 'ansible_search_path' from source: unknown 11701 1727096134.90984: calling self._execute() 11701 1727096134.91090: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096134.91101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096134.91119: variable 'omit' from source: magic vars 11701 1727096134.91500: variable 'ansible_distribution_major_version' from source: facts 11701 1727096134.91521: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096134.91671: variable 'profile_stat' from source: set_fact 11701 1727096134.91701: Evaluated conditional (profile_stat.stat.exists): False 11701 1727096134.91704: when evaluation is False, skipping this task 11701 1727096134.91707: _execute() done 11701 1727096134.91811: dumping result to json 11701 1727096134.91814: done dumping result, returning 11701 1727096134.91817: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [0afff68d-5257-a05c-c957-0000000003fe] 11701 1727096134.91820: sending task result for task 0afff68d-5257-a05c-c957-0000000003fe skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11701 1727096134.91947: no more pending results, returning what we have 11701 1727096134.91954: results queue empty 11701 1727096134.91955: checking for any_errors_fatal 11701 1727096134.91960: done checking for any_errors_fatal 11701 1727096134.91961: checking for max_fail_percentage 11701 1727096134.91963: done checking for max_fail_percentage 11701 1727096134.91964: checking to see if all hosts have failed and the running result is not ok 11701 1727096134.91965: done checking to see if all hosts have failed 11701 1727096134.91966: getting the remaining hosts for this loop 11701 1727096134.91969: done getting the remaining hosts for this loop 11701 1727096134.91973: getting the next task for host managed_node3 11701 1727096134.91980: done getting next task for host managed_node3 11701 1727096134.91983: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11701 1727096134.91987: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096134.91992: getting variables 11701 1727096134.91994: in VariableManager get_vars() 11701 1727096134.92037: Calling all_inventory to load vars for managed_node3 11701 1727096134.92040: Calling groups_inventory to load vars for managed_node3 11701 1727096134.92042: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096134.92058: Calling all_plugins_play to load vars for managed_node3 11701 1727096134.92060: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096134.92063: Calling groups_plugins_play to load vars for managed_node3 11701 1727096134.92881: done sending task result for task 0afff68d-5257-a05c-c957-0000000003fe 11701 1727096134.92884: WORKER PROCESS EXITING 11701 1727096134.93894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096134.95703: done with get_vars() 11701 1727096134.95727: done getting variables 11701 1727096134.95898: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11701 1727096134.96019: variable 'profile' from source: include params 11701 1727096134.96023: variable 'item' from source: include params 11701 1727096134.96103: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.0] ********************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Monday 23 September 2024 08:55:34 -0400 (0:00:00.060) 0:00:18.925 ****** 11701 1727096134.96136: entering _queue_task() for managed_node3/set_fact 11701 1727096134.96498: worker is 1 (out of 1 available) 11701 1727096134.96514: exiting _queue_task() for managed_node3/set_fact 11701 1727096134.96526: done queuing things up, now waiting for results queue to drain 11701 1727096134.96528: waiting for pending results... 
11701 1727096134.97013: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 11701 1727096134.97163: in run() - task 0afff68d-5257-a05c-c957-0000000003ff 11701 1727096134.97186: variable 'ansible_search_path' from source: unknown 11701 1727096134.97193: variable 'ansible_search_path' from source: unknown 11701 1727096134.97234: calling self._execute() 11701 1727096134.97349: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096134.97382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096134.97386: variable 'omit' from source: magic vars 11701 1727096134.97772: variable 'ansible_distribution_major_version' from source: facts 11701 1727096134.97818: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096134.97916: variable 'profile_stat' from source: set_fact 11701 1727096134.97943: Evaluated conditional (profile_stat.stat.exists): False 11701 1727096134.97955: when evaluation is False, skipping this task 11701 1727096134.97972: _execute() done 11701 1727096134.97975: dumping result to json 11701 1727096134.98000: done dumping result, returning 11701 1727096134.98003: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [0afff68d-5257-a05c-c957-0000000003ff] 11701 1727096134.98006: sending task result for task 0afff68d-5257-a05c-c957-0000000003ff 11701 1727096134.98257: done sending task result for task 0afff68d-5257-a05c-c957-0000000003ff 11701 1727096134.98261: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11701 1727096134.98314: no more pending results, returning what we have 11701 1727096134.98319: results queue empty 11701 1727096134.98320: checking for any_errors_fatal 11701 1727096134.98327: done checking for any_errors_fatal 11701 1727096134.98327: checking for max_fail_percentage 11701 1727096134.98330: done checking for max_fail_percentage 11701 1727096134.98331: checking to see if all hosts have failed and the running result is not ok 11701 1727096134.98332: done checking to see if all hosts have failed 11701 1727096134.98333: getting the remaining hosts for this loop 11701 1727096134.98334: done getting the remaining hosts for this loop 11701 1727096134.98337: getting the next task for host managed_node3 11701 1727096134.98345: done getting next task for host managed_node3 11701 1727096134.98348: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11701 1727096134.98356: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096134.98561: getting variables 11701 1727096134.98563: in VariableManager get_vars() 11701 1727096134.98602: Calling all_inventory to load vars for managed_node3 11701 1727096134.98605: Calling groups_inventory to load vars for managed_node3 11701 1727096134.98607: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096134.98617: Calling all_plugins_play to load vars for managed_node3 11701 1727096134.98620: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096134.98623: Calling groups_plugins_play to load vars for managed_node3 11701 1727096135.00874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096135.04385: done with get_vars() 11701 1727096135.04411: done getting variables 11701 1727096135.04479: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11701 1727096135.04803: variable 'profile' from source: include params 11701 1727096135.04808: variable 'item' from source: include params 11701 1727096135.04870: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.0] **************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Monday 23 September 2024 08:55:35 -0400 (0:00:00.087) 0:00:19.013 ****** 11701 1727096135.04903: entering _queue_task() for managed_node3/command 11701 1727096135.05647: worker is 1 (out of 1 available) 11701 1727096135.05659: exiting _queue_task() for managed_node3/command 11701 1727096135.05672: done queuing things up, now waiting for results queue to drain 11701 1727096135.05673: waiting for pending results... 
11701 1727096135.06230: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.0 11701 1727096135.06541: in run() - task 0afff68d-5257-a05c-c957-000000000400 11701 1727096135.06545: variable 'ansible_search_path' from source: unknown 11701 1727096135.06548: variable 'ansible_search_path' from source: unknown 11701 1727096135.06553: calling self._execute() 11701 1727096135.06724: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096135.06767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096135.06975: variable 'omit' from source: magic vars 11701 1727096135.07542: variable 'ansible_distribution_major_version' from source: facts 11701 1727096135.07733: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096135.07875: variable 'profile_stat' from source: set_fact 11701 1727096135.07893: Evaluated conditional (profile_stat.stat.exists): False 11701 1727096135.07901: when evaluation is False, skipping this task 11701 1727096135.07907: _execute() done 11701 1727096135.07956: dumping result to json 11701 1727096135.07963: done dumping result, returning 11701 1727096135.07976: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.0 [0afff68d-5257-a05c-c957-000000000400] 11701 1727096135.07984: sending task result for task 0afff68d-5257-a05c-c957-000000000400 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11701 1727096135.08153: no more pending results, returning what we have 11701 1727096135.08157: results queue empty 11701 1727096135.08159: checking for any_errors_fatal 11701 1727096135.08166: done checking for any_errors_fatal 11701 1727096135.08167: checking for max_fail_percentage 11701 1727096135.08170: done checking for max_fail_percentage 11701 1727096135.08171: checking to see if all hosts have failed and the running result is not ok 11701 1727096135.08172: done checking to see if all hosts have failed 11701 1727096135.08173: getting the remaining hosts for this loop 11701 1727096135.08174: done getting the remaining hosts for this loop 11701 1727096135.08178: getting the next task for host managed_node3 11701 1727096135.08185: done getting next task for host managed_node3 11701 1727096135.08188: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11701 1727096135.08193: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096135.08197: getting variables 11701 1727096135.08199: in VariableManager get_vars() 11701 1727096135.08243: Calling all_inventory to load vars for managed_node3 11701 1727096135.08245: Calling groups_inventory to load vars for managed_node3 11701 1727096135.08248: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096135.08262: Calling all_plugins_play to load vars for managed_node3 11701 1727096135.08265: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096135.08472: Calling groups_plugins_play to load vars for managed_node3 11701 1727096135.09181: done sending task result for task 0afff68d-5257-a05c-c957-000000000400 11701 1727096135.09184: WORKER PROCESS EXITING 11701 1727096135.10764: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096135.13937: done with get_vars() 11701 1727096135.13966: done getting variables 11701 1727096135.14128: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11701 1727096135.14274: variable 'profile' from source: include params 11701 1727096135.14278: variable 'item' from source: include params 11701 1727096135.14381: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.0] ************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Monday 23 September 2024 08:55:35 -0400 (0:00:00.095) 0:00:19.108 ****** 11701 1727096135.14435: entering _queue_task() for managed_node3/set_fact 11701 1727096135.14810: worker is 1 (out of 1 available) 11701 1727096135.14822: exiting _queue_task() for managed_node3/set_fact 11701 1727096135.14832: done queuing things up, now waiting for results queue to drain 11701 1727096135.14833: waiting for pending results... 
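The task skipped above (get_profile_stat.yml:62) loads the 'command' action, and the one queued next (:69) loads 'set_fact'; both are gated on the same profile_stat.stat.exists flag, so both skip while the profile file is absent. A hedged sketch of what that pair could look like; only the task names, the when-condition and the lsr_net_profile_fingerprint fact come from the log, while the register name, grep pattern and file path are assumptions:

- name: Get the fingerprint comment in ifcfg-{{ profile }}
  ansible.builtin.command: grep "^# System Role:" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # pattern/path assumed
  register: fingerprint_comment     # hypothetical register name
  changed_when: false
  failed_when: false
  when: profile_stat.stat.exists

- name: Verify the fingerprint comment in ifcfg-{{ profile }}
  ansible.builtin.set_fact:
    lsr_net_profile_fingerprint: "{{ fingerprint_comment.rc == 0 }}"   # derivation is an assumption
  when: profile_stat.stat.exists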
11701 1727096135.15136: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.0 11701 1727096135.15246: in run() - task 0afff68d-5257-a05c-c957-000000000401 11701 1727096135.15262: variable 'ansible_search_path' from source: unknown 11701 1727096135.15266: variable 'ansible_search_path' from source: unknown 11701 1727096135.15302: calling self._execute() 11701 1727096135.15406: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096135.15410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096135.15429: variable 'omit' from source: magic vars 11701 1727096135.15811: variable 'ansible_distribution_major_version' from source: facts 11701 1727096135.15823: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096135.15959: variable 'profile_stat' from source: set_fact 11701 1727096135.15982: Evaluated conditional (profile_stat.stat.exists): False 11701 1727096135.15986: when evaluation is False, skipping this task 11701 1727096135.15988: _execute() done 11701 1727096135.15991: dumping result to json 11701 1727096135.15993: done dumping result, returning 11701 1727096135.16000: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [0afff68d-5257-a05c-c957-000000000401] 11701 1727096135.16005: sending task result for task 0afff68d-5257-a05c-c957-000000000401 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11701 1727096135.16148: no more pending results, returning what we have 11701 1727096135.16158: results queue empty 11701 1727096135.16159: checking for any_errors_fatal 11701 1727096135.16173: done checking for any_errors_fatal 11701 1727096135.16176: checking for max_fail_percentage 11701 1727096135.16178: done checking for max_fail_percentage 11701 1727096135.16179: checking to see if all hosts have failed and the running result is not ok 11701 1727096135.16180: done checking to see if all hosts have failed 11701 1727096135.16181: getting the remaining hosts for this loop 11701 1727096135.16182: done getting the remaining hosts for this loop 11701 1727096135.16187: getting the next task for host managed_node3 11701 1727096135.16198: done getting next task for host managed_node3 11701 1727096135.16201: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11701 1727096135.16205: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096135.16210: getting variables 11701 1727096135.16212: in VariableManager get_vars() 11701 1727096135.16261: Calling all_inventory to load vars for managed_node3 11701 1727096135.16263: Calling groups_inventory to load vars for managed_node3 11701 1727096135.16266: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096135.16699: done sending task result for task 0afff68d-5257-a05c-c957-000000000401 11701 1727096135.16715: Calling all_plugins_play to load vars for managed_node3 11701 1727096135.16722: WORKER PROCESS EXITING 11701 1727096135.16736: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096135.16836: Calling groups_plugins_play to load vars for managed_node3 11701 1727096135.19169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096135.20836: done with get_vars() 11701 1727096135.20876: done getting variables 11701 1727096135.20931: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11701 1727096135.21055: variable 'profile' from source: include params 11701 1727096135.21064: variable 'item' from source: include params 11701 1727096135.21126: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.0'] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Monday 23 September 2024 08:55:35 -0400 (0:00:00.067) 0:00:19.176 ****** 11701 1727096135.21162: entering _queue_task() for managed_node3/assert 11701 1727096135.21693: worker is 1 (out of 1 available) 11701 1727096135.21704: exiting _queue_task() for managed_node3/assert 11701 1727096135.21713: done queuing things up, now waiting for results queue to drain 11701 1727096135.21715: waiting for pending results... 
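The log that follows runs the three assertions from assert_profile_present.yml (lines 5, 10 and 15); each one evaluates a flag set earlier by get_profile_stat.yml and returns "All assertions passed". Based on the task names and the conditionals the executor reports, the tasks are likely close to this sketch (any failure messages or extra options are not shown in the log and are omitted here):

- name: Assert that the profile is present - '{{ profile }}'
  ansible.builtin.assert:
    that:
      - lsr_net_profile_exists

- name: Assert that the ansible managed comment is present in '{{ profile }}'
  ansible.builtin.assert:
    that:
      - lsr_net_profile_ansible_managed

- name: Assert that the fingerprint comment is present in {{ profile }}
  ansible.builtin.assert:
    that:
      - lsr_net_profile_fingerprint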
11701 1727096135.21985: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.0' 11701 1727096135.21990: in run() - task 0afff68d-5257-a05c-c957-000000000267 11701 1727096135.21994: variable 'ansible_search_path' from source: unknown 11701 1727096135.21996: variable 'ansible_search_path' from source: unknown 11701 1727096135.22028: calling self._execute() 11701 1727096135.22136: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096135.22139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096135.22474: variable 'omit' from source: magic vars 11701 1727096135.22551: variable 'ansible_distribution_major_version' from source: facts 11701 1727096135.22566: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096135.22579: variable 'omit' from source: magic vars 11701 1727096135.22634: variable 'omit' from source: magic vars 11701 1727096135.22742: variable 'profile' from source: include params 11701 1727096135.22746: variable 'item' from source: include params 11701 1727096135.22819: variable 'item' from source: include params 11701 1727096135.22840: variable 'omit' from source: magic vars 11701 1727096135.22888: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096135.22930: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096135.22952: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096135.22974: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096135.22989: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096135.23019: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096135.23030: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096135.23035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096135.23272: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096135.23276: Set connection var ansible_timeout to 10 11701 1727096135.23278: Set connection var ansible_shell_type to sh 11701 1727096135.23280: Set connection var ansible_shell_executable to /bin/sh 11701 1727096135.23282: Set connection var ansible_connection to ssh 11701 1727096135.23285: Set connection var ansible_pipelining to False 11701 1727096135.23287: variable 'ansible_shell_executable' from source: unknown 11701 1727096135.23289: variable 'ansible_connection' from source: unknown 11701 1727096135.23291: variable 'ansible_module_compression' from source: unknown 11701 1727096135.23293: variable 'ansible_shell_type' from source: unknown 11701 1727096135.23295: variable 'ansible_shell_executable' from source: unknown 11701 1727096135.23298: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096135.23300: variable 'ansible_pipelining' from source: unknown 11701 1727096135.23302: variable 'ansible_timeout' from source: unknown 11701 1727096135.23304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096135.23371: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096135.23383: variable 'omit' from source: magic vars 11701 1727096135.23388: starting attempt loop 11701 1727096135.23391: running the handler 11701 1727096135.23509: variable 'lsr_net_profile_exists' from source: set_fact 11701 1727096135.23672: Evaluated conditional (lsr_net_profile_exists): True 11701 1727096135.23675: handler run complete 11701 1727096135.23677: attempt loop complete, returning result 11701 1727096135.23679: _execute() done 11701 1727096135.23685: dumping result to json 11701 1727096135.23687: done dumping result, returning 11701 1727096135.23689: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.0' [0afff68d-5257-a05c-c957-000000000267] 11701 1727096135.23691: sending task result for task 0afff68d-5257-a05c-c957-000000000267 11701 1727096135.23752: done sending task result for task 0afff68d-5257-a05c-c957-000000000267 11701 1727096135.23754: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11701 1727096135.23837: no more pending results, returning what we have 11701 1727096135.23841: results queue empty 11701 1727096135.23842: checking for any_errors_fatal 11701 1727096135.23849: done checking for any_errors_fatal 11701 1727096135.23852: checking for max_fail_percentage 11701 1727096135.23854: done checking for max_fail_percentage 11701 1727096135.23856: checking to see if all hosts have failed and the running result is not ok 11701 1727096135.23857: done checking to see if all hosts have failed 11701 1727096135.23857: getting the remaining hosts for this loop 11701 1727096135.23858: done getting the remaining hosts for this loop 11701 1727096135.23862: getting the next task for host managed_node3 11701 1727096135.23870: done getting next task for host managed_node3 11701 1727096135.23874: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11701 1727096135.23878: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096135.23882: getting variables 11701 1727096135.23884: in VariableManager get_vars() 11701 1727096135.23931: Calling all_inventory to load vars for managed_node3 11701 1727096135.23933: Calling groups_inventory to load vars for managed_node3 11701 1727096135.23936: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096135.23948: Calling all_plugins_play to load vars for managed_node3 11701 1727096135.23954: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096135.23957: Calling groups_plugins_play to load vars for managed_node3 11701 1727096135.25528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096135.27226: done with get_vars() 11701 1727096135.27260: done getting variables 11701 1727096135.27324: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11701 1727096135.27458: variable 'profile' from source: include params 11701 1727096135.27462: variable 'item' from source: include params 11701 1727096135.27518: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.0'] ********* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Monday 23 September 2024 08:55:35 -0400 (0:00:00.063) 0:00:19.240 ****** 11701 1727096135.27564: entering _queue_task() for managed_node3/assert 11701 1727096135.27924: worker is 1 (out of 1 available) 11701 1727096135.27938: exiting _queue_task() for managed_node3/assert 11701 1727096135.27950: done queuing things up, now waiting for results queue to drain 11701 1727096135.27953: waiting for pending results... 
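For each assert the executor re-resolves the connection settings for managed_node3 ("Set connection var ..."): ssh connection, sh shell, a 10-second timeout and pipelining disabled. The log shows ansible_host and ansible_ssh_extra_args coming from host vars, while the rest fall back to defaults ("from source: unknown"). A sketch of an inventory entry that would supply those host vars; the address matches the one that appears later in the ssh debug output, and the extra-args value is purely illustrative:

all:
  hosts:
    managed_node3:
      ansible_host: 10.31.14.152                              # address seen in the ssh debug output below
      ansible_ssh_extra_args: -o StrictHostKeyChecking=no     # illustrative value only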
11701 1727096135.28247: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.0' 11701 1727096135.28353: in run() - task 0afff68d-5257-a05c-c957-000000000268 11701 1727096135.28673: variable 'ansible_search_path' from source: unknown 11701 1727096135.28677: variable 'ansible_search_path' from source: unknown 11701 1727096135.28680: calling self._execute() 11701 1727096135.28683: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096135.28685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096135.28688: variable 'omit' from source: magic vars 11701 1727096135.28912: variable 'ansible_distribution_major_version' from source: facts 11701 1727096135.28924: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096135.28930: variable 'omit' from source: magic vars 11701 1727096135.28976: variable 'omit' from source: magic vars 11701 1727096135.29079: variable 'profile' from source: include params 11701 1727096135.29083: variable 'item' from source: include params 11701 1727096135.29145: variable 'item' from source: include params 11701 1727096135.29175: variable 'omit' from source: magic vars 11701 1727096135.29215: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096135.29250: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096135.29281: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096135.29301: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096135.29311: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096135.29341: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096135.29345: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096135.29348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096135.29572: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096135.29576: Set connection var ansible_timeout to 10 11701 1727096135.29579: Set connection var ansible_shell_type to sh 11701 1727096135.29581: Set connection var ansible_shell_executable to /bin/sh 11701 1727096135.29583: Set connection var ansible_connection to ssh 11701 1727096135.29586: Set connection var ansible_pipelining to False 11701 1727096135.29587: variable 'ansible_shell_executable' from source: unknown 11701 1727096135.29589: variable 'ansible_connection' from source: unknown 11701 1727096135.29593: variable 'ansible_module_compression' from source: unknown 11701 1727096135.29596: variable 'ansible_shell_type' from source: unknown 11701 1727096135.29599: variable 'ansible_shell_executable' from source: unknown 11701 1727096135.29602: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096135.29604: variable 'ansible_pipelining' from source: unknown 11701 1727096135.29607: variable 'ansible_timeout' from source: unknown 11701 1727096135.29610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096135.29684: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096135.29695: variable 'omit' from source: magic vars 11701 1727096135.29705: starting attempt loop 11701 1727096135.29709: running the handler 11701 1727096135.29972: variable 'lsr_net_profile_ansible_managed' from source: set_fact 11701 1727096135.29976: Evaluated conditional (lsr_net_profile_ansible_managed): True 11701 1727096135.29978: handler run complete 11701 1727096135.29980: attempt loop complete, returning result 11701 1727096135.29981: _execute() done 11701 1727096135.29983: dumping result to json 11701 1727096135.29985: done dumping result, returning 11701 1727096135.29987: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.0' [0afff68d-5257-a05c-c957-000000000268] 11701 1727096135.29989: sending task result for task 0afff68d-5257-a05c-c957-000000000268 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11701 1727096135.30101: no more pending results, returning what we have 11701 1727096135.30105: results queue empty 11701 1727096135.30106: checking for any_errors_fatal 11701 1727096135.30111: done checking for any_errors_fatal 11701 1727096135.30112: checking for max_fail_percentage 11701 1727096135.30113: done checking for max_fail_percentage 11701 1727096135.30114: checking to see if all hosts have failed and the running result is not ok 11701 1727096135.30115: done checking to see if all hosts have failed 11701 1727096135.30116: getting the remaining hosts for this loop 11701 1727096135.30117: done getting the remaining hosts for this loop 11701 1727096135.30121: getting the next task for host managed_node3 11701 1727096135.30128: done getting next task for host managed_node3 11701 1727096135.30132: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11701 1727096135.30136: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096135.30140: getting variables 11701 1727096135.30142: in VariableManager get_vars() 11701 1727096135.30189: Calling all_inventory to load vars for managed_node3 11701 1727096135.30192: Calling groups_inventory to load vars for managed_node3 11701 1727096135.30194: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096135.30208: Calling all_plugins_play to load vars for managed_node3 11701 1727096135.30212: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096135.30215: Calling groups_plugins_play to load vars for managed_node3 11701 1727096135.30880: done sending task result for task 0afff68d-5257-a05c-c957-000000000268 11701 1727096135.30884: WORKER PROCESS EXITING 11701 1727096135.33207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096135.36362: done with get_vars() 11701 1727096135.36710: done getting variables 11701 1727096135.36878: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11701 1727096135.37042: variable 'profile' from source: include params 11701 1727096135.37046: variable 'item' from source: include params 11701 1727096135.37228: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.0] *************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Monday 23 September 2024 08:55:35 -0400 (0:00:00.097) 0:00:19.337 ****** 11701 1727096135.37272: entering _queue_task() for managed_node3/assert 11701 1727096135.38072: worker is 1 (out of 1 available) 11701 1727096135.38087: exiting _queue_task() for managed_node3/assert 11701 1727096135.38191: done queuing things up, now waiting for results queue to drain 11701 1727096135.38193: waiting for pending results... 
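Throughout this block the profile and item variables resolve "from source: include params", which is why the templated task names render as ifcfg-bond0.0 and 'bond0.0'. A hedged sketch of the kind of looped include that produces those params; the loop items other than bond0.0 and the relative file path are assumptions:

- name: Assert each controller/port profile is present
  ansible.builtin.include_tasks: tasks/assert_profile_present.yml
  vars:
    profile: "{{ item }}"
  loop:
    - bond0       # assumed sibling profiles
    - bond0.0     # confirmed by the task headers in this log
    - bond0.1     # assumed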
11701 1727096135.38453: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.0 11701 1727096135.38678: in run() - task 0afff68d-5257-a05c-c957-000000000269 11701 1727096135.38974: variable 'ansible_search_path' from source: unknown 11701 1727096135.38978: variable 'ansible_search_path' from source: unknown 11701 1727096135.38981: calling self._execute() 11701 1727096135.38983: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096135.38986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096135.38989: variable 'omit' from source: magic vars 11701 1727096135.39705: variable 'ansible_distribution_major_version' from source: facts 11701 1727096135.40073: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096135.40077: variable 'omit' from source: magic vars 11701 1727096135.40079: variable 'omit' from source: magic vars 11701 1727096135.40139: variable 'profile' from source: include params 11701 1727096135.40150: variable 'item' from source: include params 11701 1727096135.40219: variable 'item' from source: include params 11701 1727096135.40572: variable 'omit' from source: magic vars 11701 1727096135.40576: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096135.40579: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096135.40603: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096135.40623: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096135.40638: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096135.40672: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096135.40679: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096135.40685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096135.41172: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096135.41176: Set connection var ansible_timeout to 10 11701 1727096135.41179: Set connection var ansible_shell_type to sh 11701 1727096135.41182: Set connection var ansible_shell_executable to /bin/sh 11701 1727096135.41185: Set connection var ansible_connection to ssh 11701 1727096135.41188: Set connection var ansible_pipelining to False 11701 1727096135.41190: variable 'ansible_shell_executable' from source: unknown 11701 1727096135.41193: variable 'ansible_connection' from source: unknown 11701 1727096135.41195: variable 'ansible_module_compression' from source: unknown 11701 1727096135.41198: variable 'ansible_shell_type' from source: unknown 11701 1727096135.41200: variable 'ansible_shell_executable' from source: unknown 11701 1727096135.41203: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096135.41205: variable 'ansible_pipelining' from source: unknown 11701 1727096135.41208: variable 'ansible_timeout' from source: unknown 11701 1727096135.41211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096135.41376: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096135.41395: variable 'omit' from source: magic vars 11701 1727096135.41406: starting attempt loop 11701 1727096135.41412: running the handler 11701 1727096135.41530: variable 'lsr_net_profile_fingerprint' from source: set_fact 11701 1727096135.41873: Evaluated conditional (lsr_net_profile_fingerprint): True 11701 1727096135.41877: handler run complete 11701 1727096135.41879: attempt loop complete, returning result 11701 1727096135.41881: _execute() done 11701 1727096135.41883: dumping result to json 11701 1727096135.41885: done dumping result, returning 11701 1727096135.41887: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.0 [0afff68d-5257-a05c-c957-000000000269] 11701 1727096135.41890: sending task result for task 0afff68d-5257-a05c-c957-000000000269 11701 1727096135.41954: done sending task result for task 0afff68d-5257-a05c-c957-000000000269 11701 1727096135.41957: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11701 1727096135.42017: no more pending results, returning what we have 11701 1727096135.42021: results queue empty 11701 1727096135.42022: checking for any_errors_fatal 11701 1727096135.42028: done checking for any_errors_fatal 11701 1727096135.42029: checking for max_fail_percentage 11701 1727096135.42030: done checking for max_fail_percentage 11701 1727096135.42031: checking to see if all hosts have failed and the running result is not ok 11701 1727096135.42032: done checking to see if all hosts have failed 11701 1727096135.42033: getting the remaining hosts for this loop 11701 1727096135.42034: done getting the remaining hosts for this loop 11701 1727096135.42037: getting the next task for host managed_node3 11701 1727096135.42048: done getting next task for host managed_node3 11701 1727096135.42053: ^ task is: TASK: Include the task 'get_profile_stat.yml' 11701 1727096135.42057: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096135.42061: getting variables 11701 1727096135.42062: in VariableManager get_vars() 11701 1727096135.42105: Calling all_inventory to load vars for managed_node3 11701 1727096135.42108: Calling groups_inventory to load vars for managed_node3 11701 1727096135.42110: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096135.42121: Calling all_plugins_play to load vars for managed_node3 11701 1727096135.42123: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096135.42126: Calling groups_plugins_play to load vars for managed_node3 11701 1727096135.44302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096135.47477: done with get_vars() 11701 1727096135.47512: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Monday 23 September 2024 08:55:35 -0400 (0:00:00.103) 0:00:19.440 ****** 11701 1727096135.47626: entering _queue_task() for managed_node3/include_tasks 11701 1727096135.48043: worker is 1 (out of 1 available) 11701 1727096135.48056: exiting _queue_task() for managed_node3/include_tasks 11701 1727096135.48075: done queuing things up, now waiting for results queue to drain 11701 1727096135.48077: waiting for pending results... 11701 1727096135.48356: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 11701 1727096135.48482: in run() - task 0afff68d-5257-a05c-c957-00000000026d 11701 1727096135.48505: variable 'ansible_search_path' from source: unknown 11701 1727096135.48513: variable 'ansible_search_path' from source: unknown 11701 1727096135.48557: calling self._execute() 11701 1727096135.48659: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096135.48674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096135.48692: variable 'omit' from source: magic vars 11701 1727096135.49060: variable 'ansible_distribution_major_version' from source: facts 11701 1727096135.49079: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096135.49090: _execute() done 11701 1727096135.49098: dumping result to json 11701 1727096135.49104: done dumping result, returning 11701 1727096135.49114: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0afff68d-5257-a05c-c957-00000000026d] 11701 1727096135.49122: sending task result for task 0afff68d-5257-a05c-c957-00000000026d 11701 1727096135.49234: done sending task result for task 0afff68d-5257-a05c-c957-00000000026d 11701 1727096135.49282: no more pending results, returning what we have 11701 1727096135.49287: in VariableManager get_vars() 11701 1727096135.49336: Calling all_inventory to load vars for managed_node3 11701 1727096135.49338: Calling groups_inventory to load vars for managed_node3 11701 1727096135.49341: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096135.49355: Calling all_plugins_play to load vars for managed_node3 11701 1727096135.49358: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096135.49360: Calling groups_plugins_play to load vars for managed_node3 11701 1727096135.49881: WORKER PROCESS EXITING 11701 1727096135.51046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 11701 1727096135.52633: done with get_vars() 11701 1727096135.52661: variable 'ansible_search_path' from source: unknown 11701 1727096135.52662: variable 'ansible_search_path' from source: unknown 11701 1727096135.52707: we have included files to process 11701 1727096135.52708: generating all_blocks data 11701 1727096135.52710: done generating all_blocks data 11701 1727096135.52715: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11701 1727096135.52716: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11701 1727096135.52719: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11701 1727096135.53620: done processing included file 11701 1727096135.53623: iterating over new_blocks loaded from include file 11701 1727096135.53624: in VariableManager get_vars() 11701 1727096135.53654: done with get_vars() 11701 1727096135.53657: filtering new block on tags 11701 1727096135.53687: done filtering new block on tags 11701 1727096135.53690: in VariableManager get_vars() 11701 1727096135.53711: done with get_vars() 11701 1727096135.53713: filtering new block on tags 11701 1727096135.53736: done filtering new block on tags 11701 1727096135.53738: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 11701 1727096135.53744: extending task lists for all hosts with included blocks 11701 1727096135.53928: done extending task lists 11701 1727096135.53929: done processing included files 11701 1727096135.53930: results queue empty 11701 1727096135.53931: checking for any_errors_fatal 11701 1727096135.53934: done checking for any_errors_fatal 11701 1727096135.53935: checking for max_fail_percentage 11701 1727096135.53936: done checking for max_fail_percentage 11701 1727096135.53936: checking to see if all hosts have failed and the running result is not ok 11701 1727096135.53937: done checking to see if all hosts have failed 11701 1727096135.53938: getting the remaining hosts for this loop 11701 1727096135.53939: done getting the remaining hosts for this loop 11701 1727096135.53941: getting the next task for host managed_node3 11701 1727096135.53945: done getting next task for host managed_node3 11701 1727096135.53947: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 11701 1727096135.53950: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096135.53952: getting variables 11701 1727096135.53953: in VariableManager get_vars() 11701 1727096135.53966: Calling all_inventory to load vars for managed_node3 11701 1727096135.53971: Calling groups_inventory to load vars for managed_node3 11701 1727096135.53973: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096135.53978: Calling all_plugins_play to load vars for managed_node3 11701 1727096135.53981: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096135.53983: Calling groups_plugins_play to load vars for managed_node3 11701 1727096135.55546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096135.57473: done with get_vars() 11701 1727096135.57504: done getting variables 11701 1727096135.57558: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Monday 23 September 2024 08:55:35 -0400 (0:00:00.099) 0:00:19.540 ****** 11701 1727096135.57593: entering _queue_task() for managed_node3/set_fact 11701 1727096135.58286: worker is 1 (out of 1 available) 11701 1727096135.58299: exiting _queue_task() for managed_node3/set_fact 11701 1727096135.58312: done queuing things up, now waiting for results queue to drain 11701 1727096135.58313: waiting for pending results... 
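The freshly included get_profile_stat.yml now starts over for the next profile: line 3 resets the three lsr_net_profile_* flags (the set_fact result printed below shows all three false) and line 9 stats the profile file before any remote verification runs. A sketch consistent with those task names, the printed facts and the profile_stat register used by the later conditionals; the ifcfg path used by the stat is an assumption:

- name: Initialize NM profile exist and ansible_managed comment flag
  ansible.builtin.set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false

- name: Stat profile file
  ansible.builtin.stat:
    path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # path is an assumption
  register: profile_stat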
11701 1727096135.59030: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 11701 1727096135.59475: in run() - task 0afff68d-5257-a05c-c957-000000000440 11701 1727096135.59479: variable 'ansible_search_path' from source: unknown 11701 1727096135.59482: variable 'ansible_search_path' from source: unknown 11701 1727096135.59485: calling self._execute() 11701 1727096135.59658: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096135.59780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096135.59792: variable 'omit' from source: magic vars 11701 1727096135.60576: variable 'ansible_distribution_major_version' from source: facts 11701 1727096135.60585: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096135.60592: variable 'omit' from source: magic vars 11701 1727096135.60640: variable 'omit' from source: magic vars 11701 1727096135.60966: variable 'omit' from source: magic vars 11701 1727096135.60976: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096135.61034: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096135.61038: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096135.61040: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096135.61043: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096135.61122: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096135.61126: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096135.61129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096135.61364: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096135.61371: Set connection var ansible_timeout to 10 11701 1727096135.61374: Set connection var ansible_shell_type to sh 11701 1727096135.61380: Set connection var ansible_shell_executable to /bin/sh 11701 1727096135.61383: Set connection var ansible_connection to ssh 11701 1727096135.61393: Set connection var ansible_pipelining to False 11701 1727096135.61532: variable 'ansible_shell_executable' from source: unknown 11701 1727096135.61536: variable 'ansible_connection' from source: unknown 11701 1727096135.61539: variable 'ansible_module_compression' from source: unknown 11701 1727096135.61541: variable 'ansible_shell_type' from source: unknown 11701 1727096135.61544: variable 'ansible_shell_executable' from source: unknown 11701 1727096135.61546: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096135.61553: variable 'ansible_pipelining' from source: unknown 11701 1727096135.61556: variable 'ansible_timeout' from source: unknown 11701 1727096135.61558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096135.62078: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096135.62083: variable 
'omit' from source: magic vars 11701 1727096135.62085: starting attempt loop 11701 1727096135.62086: running the handler 11701 1727096135.62088: handler run complete 11701 1727096135.62090: attempt loop complete, returning result 11701 1727096135.62091: _execute() done 11701 1727096135.62093: dumping result to json 11701 1727096135.62095: done dumping result, returning 11701 1727096135.62097: done running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0afff68d-5257-a05c-c957-000000000440] 11701 1727096135.62099: sending task result for task 0afff68d-5257-a05c-c957-000000000440 11701 1727096135.62160: done sending task result for task 0afff68d-5257-a05c-c957-000000000440 11701 1727096135.62163: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 11701 1727096135.62232: no more pending results, returning what we have 11701 1727096135.62235: results queue empty 11701 1727096135.62236: checking for any_errors_fatal 11701 1727096135.62238: done checking for any_errors_fatal 11701 1727096135.62238: checking for max_fail_percentage 11701 1727096135.62240: done checking for max_fail_percentage 11701 1727096135.62241: checking to see if all hosts have failed and the running result is not ok 11701 1727096135.62242: done checking to see if all hosts have failed 11701 1727096135.62243: getting the remaining hosts for this loop 11701 1727096135.62244: done getting the remaining hosts for this loop 11701 1727096135.62247: getting the next task for host managed_node3 11701 1727096135.62256: done getting next task for host managed_node3 11701 1727096135.62259: ^ task is: TASK: Stat profile file 11701 1727096135.62264: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096135.62270: getting variables 11701 1727096135.62272: in VariableManager get_vars() 11701 1727096135.62316: Calling all_inventory to load vars for managed_node3 11701 1727096135.62318: Calling groups_inventory to load vars for managed_node3 11701 1727096135.62321: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096135.62332: Calling all_plugins_play to load vars for managed_node3 11701 1727096135.62334: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096135.62337: Calling groups_plugins_play to load vars for managed_node3 11701 1727096135.64041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096135.65662: done with get_vars() 11701 1727096135.65696: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Monday 23 September 2024 08:55:35 -0400 (0:00:00.082) 0:00:19.622 ****** 11701 1727096135.65801: entering _queue_task() for managed_node3/stat 11701 1727096135.66173: worker is 1 (out of 1 available) 11701 1727096135.66186: exiting _queue_task() for managed_node3/stat 11701 1727096135.66206: done queuing things up, now waiting for results queue to drain 11701 1727096135.66207: waiting for pending results... 11701 1727096135.66462: running TaskExecutor() for managed_node3/TASK: Stat profile file 11701 1727096135.66578: in run() - task 0afff68d-5257-a05c-c957-000000000441 11701 1727096135.66600: variable 'ansible_search_path' from source: unknown 11701 1727096135.66609: variable 'ansible_search_path' from source: unknown 11701 1727096135.66655: calling self._execute() 11701 1727096135.66777: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096135.66874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096135.66877: variable 'omit' from source: magic vars 11701 1727096135.67198: variable 'ansible_distribution_major_version' from source: facts 11701 1727096135.67215: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096135.67225: variable 'omit' from source: magic vars 11701 1727096135.67282: variable 'omit' from source: magic vars 11701 1727096135.67390: variable 'profile' from source: include params 11701 1727096135.67400: variable 'item' from source: include params 11701 1727096135.67472: variable 'item' from source: include params 11701 1727096135.67496: variable 'omit' from source: magic vars 11701 1727096135.67545: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096135.67592: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096135.67622: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096135.67643: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096135.67774: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096135.67777: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096135.67780: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096135.67782: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096135.67819: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096135.67832: Set connection var ansible_timeout to 10 11701 1727096135.67842: Set connection var ansible_shell_type to sh 11701 1727096135.67856: Set connection var ansible_shell_executable to /bin/sh 11701 1727096135.67865: Set connection var ansible_connection to ssh 11701 1727096135.67885: Set connection var ansible_pipelining to False 11701 1727096135.67912: variable 'ansible_shell_executable' from source: unknown 11701 1727096135.67921: variable 'ansible_connection' from source: unknown 11701 1727096135.67929: variable 'ansible_module_compression' from source: unknown 11701 1727096135.67935: variable 'ansible_shell_type' from source: unknown 11701 1727096135.67946: variable 'ansible_shell_executable' from source: unknown 11701 1727096135.67957: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096135.67966: variable 'ansible_pipelining' from source: unknown 11701 1727096135.67977: variable 'ansible_timeout' from source: unknown 11701 1727096135.67985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096135.68194: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11701 1727096135.68211: variable 'omit' from source: magic vars 11701 1727096135.68222: starting attempt loop 11701 1727096135.68231: running the handler 11701 1727096135.68254: _low_level_execute_command(): starting 11701 1727096135.68374: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096135.69002: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096135.69086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096135.69142: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096135.69162: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096135.69244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096135.70962: stdout chunk (state=3): >>>/root <<< 11701 1727096135.71101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096135.71116: stdout chunk (state=3): >>><<< 11701 1727096135.71135: stderr chunk (state=3): >>><<< 11701 1727096135.71166: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096135.71191: _low_level_execute_command(): starting 11701 1727096135.71204: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096135.7117693-12617-102014950656841 `" && echo ansible-tmp-1727096135.7117693-12617-102014950656841="` echo /root/.ansible/tmp/ansible-tmp-1727096135.7117693-12617-102014950656841 `" ) && sleep 0' 11701 1727096135.71874: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096135.71891: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096135.71908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096135.71985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096135.72048: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096135.72074: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096135.72098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096135.72176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096135.74156: stdout chunk (state=3): >>>ansible-tmp-1727096135.7117693-12617-102014950656841=/root/.ansible/tmp/ansible-tmp-1727096135.7117693-12617-102014950656841 <<< 11701 1727096135.74330: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096135.74334: stdout 
chunk (state=3): >>><<< 11701 1727096135.74337: stderr chunk (state=3): >>><<< 11701 1727096135.74359: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096135.7117693-12617-102014950656841=/root/.ansible/tmp/ansible-tmp-1727096135.7117693-12617-102014950656841 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096135.74439: variable 'ansible_module_compression' from source: unknown 11701 1727096135.74491: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11701ub3f79xu/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11701 1727096135.74536: variable 'ansible_facts' from source: unknown 11701 1727096135.74672: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096135.7117693-12617-102014950656841/AnsiballZ_stat.py 11701 1727096135.74813: Sending initial data 11701 1727096135.74816: Sent initial data (153 bytes) 11701 1727096135.75487: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096135.75603: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096135.75624: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096135.75699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096135.77330: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096135.77375: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11701 1727096135.77430: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmp20y10qc9 /root/.ansible/tmp/ansible-tmp-1727096135.7117693-12617-102014950656841/AnsiballZ_stat.py <<< 11701 1727096135.77433: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096135.7117693-12617-102014950656841/AnsiballZ_stat.py" <<< 11701 1727096135.77478: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmp20y10qc9" to remote "/root/.ansible/tmp/ansible-tmp-1727096135.7117693-12617-102014950656841/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096135.7117693-12617-102014950656841/AnsiballZ_stat.py" <<< 11701 1727096135.78278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096135.78282: stderr chunk (state=3): >>><<< 11701 1727096135.78285: stdout chunk (state=3): >>><<< 11701 1727096135.78287: done transferring module to remote 11701 1727096135.78289: _low_level_execute_command(): starting 11701 1727096135.78291: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096135.7117693-12617-102014950656841/ /root/.ansible/tmp/ansible-tmp-1727096135.7117693-12617-102014950656841/AnsiballZ_stat.py && sleep 0' 11701 1727096135.78929: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096135.78944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096135.78963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096135.78987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096135.79006: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096135.79025: stderr chunk (state=3): >>>debug2: match not found <<< 11701 1727096135.79043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096135.79090: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096135.79174: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096135.79195: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096135.79224: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096135.79305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096135.81228: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096135.81232: stdout chunk (state=3): >>><<< 11701 1727096135.81235: stderr chunk (state=3): >>><<< 11701 1727096135.81257: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096135.81269: _low_level_execute_command(): starting 11701 1727096135.81356: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096135.7117693-12617-102014950656841/AnsiballZ_stat.py && sleep 0' 11701 1727096135.81861: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096135.81915: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096135.81918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096135.81921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096135.81923: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096135.81925: stderr chunk (state=3): >>>debug2: match not found <<< 11701 1727096135.81927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096135.82013: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 
1727096135.82071: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096135.82113: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096135.98038: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11701 1727096135.99545: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 11701 1727096135.99564: stderr chunk (state=3): >>><<< 11701 1727096135.99582: stdout chunk (state=3): >>><<< 11701 1727096135.99599: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
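The stdout chunk above is the entire result of the stat module: one JSON document that the controller parses back out of the stream. A minimal Python sketch of the check this particular invocation performs (path and options taken from the module_args shown in the log; an illustration only, not the real ansible.modules.stat implementation) would be:

    import json
    import os

    def stat_profile(path):
        # Only existence is reported here because get_checksum, get_mime and
        # get_attributes are all false in this invocation; follow=false maps
        # roughly to not following symlinks, hence lexists.
        return {"changed": False, "stat": {"exists": os.path.lexists(path)}}

    # Path copied from the module_args recorded above.
    print(json.dumps(stat_profile("/etc/sysconfig/network-scripts/ifcfg-bond0.1")))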
11701 1727096135.99655: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096135.7117693-12617-102014950656841/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096135.99658: _low_level_execute_command(): starting 11701 1727096135.99733: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096135.7117693-12617-102014950656841/ > /dev/null 2>&1 && sleep 0' 11701 1727096136.00325: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096136.00340: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096136.00358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096136.00420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096136.00484: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096136.00509: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096136.00530: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096136.00602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096136.02581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096136.02592: stdout chunk (state=3): >>><<< 11701 1727096136.02605: stderr chunk (state=3): >>><<< 11701 1727096136.02634: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096136.02646: handler run complete 11701 1727096136.02679: attempt loop complete, returning result 11701 1727096136.02689: _execute() done 11701 1727096136.02697: dumping result to json 11701 1727096136.02705: done dumping result, returning 11701 1727096136.02773: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0afff68d-5257-a05c-c957-000000000441] 11701 1727096136.02776: sending task result for task 0afff68d-5257-a05c-c957-000000000441 ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 11701 1727096136.02911: no more pending results, returning what we have 11701 1727096136.02915: results queue empty 11701 1727096136.02916: checking for any_errors_fatal 11701 1727096136.02921: done checking for any_errors_fatal 11701 1727096136.02922: checking for max_fail_percentage 11701 1727096136.02924: done checking for max_fail_percentage 11701 1727096136.02925: checking to see if all hosts have failed and the running result is not ok 11701 1727096136.02926: done checking to see if all hosts have failed 11701 1727096136.02927: getting the remaining hosts for this loop 11701 1727096136.02928: done getting the remaining hosts for this loop 11701 1727096136.02932: getting the next task for host managed_node3 11701 1727096136.02939: done getting next task for host managed_node3 11701 1727096136.02942: ^ task is: TASK: Set NM profile exist flag based on the profile files 11701 1727096136.02947: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096136.02956: getting variables 11701 1727096136.02959: in VariableManager get_vars() 11701 1727096136.03005: Calling all_inventory to load vars for managed_node3 11701 1727096136.03008: Calling groups_inventory to load vars for managed_node3 11701 1727096136.03011: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096136.03023: Calling all_plugins_play to load vars for managed_node3 11701 1727096136.03027: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096136.03030: Calling groups_plugins_play to load vars for managed_node3 11701 1727096136.03824: done sending task result for task 0afff68d-5257-a05c-c957-000000000441 11701 1727096136.03829: WORKER PROCESS EXITING 11701 1727096136.06010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096136.07991: done with get_vars() 11701 1727096136.08024: done getting variables 11701 1727096136.08088: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Monday 23 September 2024 08:55:36 -0400 (0:00:00.423) 0:00:20.046 ****** 11701 1727096136.08186: entering _queue_task() for managed_node3/set_fact 11701 1727096136.08676: worker is 1 (out of 1 available) 11701 1727096136.08687: exiting _queue_task() for managed_node3/set_fact 11701 1727096136.08696: done queuing things up, now waiting for results queue to drain 11701 1727096136.08697: waiting for pending results... 
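Taken together, the records above trace the full per-task lifecycle the ssh connection plugin drives for one module run: probe the remote home directory with 'echo ~', create a unique directory under ~/.ansible/tmp with umask 77, sftp the AnsiballZ payload across the multiplexed connection, chmod it, execute it with the remote Python interpreter, then remove the temporary directory. The sketch below replays that sequence with plain ssh/scp calls; the commands mirror the ones logged, but the helper itself is hypothetical and is not Ansible's connection plugin.

    import subprocess
    import time

    def run_module_over_ssh(host, local_module, remote_python="/usr/bin/python3.12"):
        """Replay the command sequence visible in this log (hypothetical helper)."""
        # 1. Probe the remote home directory.
        home = subprocess.run(["ssh", host, "echo ~ && sleep 0"],
                              capture_output=True, text=True, check=True).stdout.strip()
        # 2. Create a unique remote tmp directory under ~/.ansible/tmp (umask 77).
        tmpdir = f"{home}/.ansible/tmp/ansible-tmp-{time.time()}"
        subprocess.run(["ssh", host,
                        f"umask 77 && mkdir -p {home}/.ansible/tmp && mkdir {tmpdir}"],
                       check=True)
        # 3. Transfer the AnsiballZ payload (the log does this with sftp over the mux master).
        remote_module = f"{tmpdir}/AnsiballZ_module.py"
        subprocess.run(["scp", local_module, f"{host}:{remote_module}"], check=True)
        # 4. Make the tmp directory and the payload executable by the owner.
        subprocess.run(["ssh", host, f"chmod u+x {tmpdir}/ {remote_module}"], check=True)
        # 5. Execute the payload and capture the JSON result it prints on stdout.
        result = subprocess.run(["ssh", host, f"{remote_python} {remote_module}"],
                                capture_output=True, text=True, check=True).stdout
        # 6. Clean up the remote tmp directory.
        subprocess.run(["ssh", host, f"rm -rf {tmpdir}"], check=True)
        return result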
11701 1727096136.09089: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 11701 1727096136.09402: in run() - task 0afff68d-5257-a05c-c957-000000000442 11701 1727096136.09406: variable 'ansible_search_path' from source: unknown 11701 1727096136.09409: variable 'ansible_search_path' from source: unknown 11701 1727096136.09413: calling self._execute() 11701 1727096136.09619: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096136.09638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096136.09862: variable 'omit' from source: magic vars 11701 1727096136.10315: variable 'ansible_distribution_major_version' from source: facts 11701 1727096136.10335: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096136.10477: variable 'profile_stat' from source: set_fact 11701 1727096136.10496: Evaluated conditional (profile_stat.stat.exists): False 11701 1727096136.10509: when evaluation is False, skipping this task 11701 1727096136.10519: _execute() done 11701 1727096136.10527: dumping result to json 11701 1727096136.10535: done dumping result, returning 11701 1727096136.10546: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0afff68d-5257-a05c-c957-000000000442] 11701 1727096136.10558: sending task result for task 0afff68d-5257-a05c-c957-000000000442 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11701 1727096136.10828: no more pending results, returning what we have 11701 1727096136.10834: results queue empty 11701 1727096136.10836: checking for any_errors_fatal 11701 1727096136.10844: done checking for any_errors_fatal 11701 1727096136.10845: checking for max_fail_percentage 11701 1727096136.10847: done checking for max_fail_percentage 11701 1727096136.10848: checking to see if all hosts have failed and the running result is not ok 11701 1727096136.10849: done checking to see if all hosts have failed 11701 1727096136.10850: getting the remaining hosts for this loop 11701 1727096136.10854: done getting the remaining hosts for this loop 11701 1727096136.10857: getting the next task for host managed_node3 11701 1727096136.10866: done getting next task for host managed_node3 11701 1727096136.10871: ^ task is: TASK: Get NM profile info 11701 1727096136.10876: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096136.10881: getting variables 11701 1727096136.10883: in VariableManager get_vars() 11701 1727096136.10929: Calling all_inventory to load vars for managed_node3 11701 1727096136.10931: Calling groups_inventory to load vars for managed_node3 11701 1727096136.10934: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096136.10948: Calling all_plugins_play to load vars for managed_node3 11701 1727096136.10954: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096136.10957: Calling groups_plugins_play to load vars for managed_node3 11701 1727096136.11485: done sending task result for task 0afff68d-5257-a05c-c957-000000000442 11701 1727096136.11489: WORKER PROCESS EXITING 11701 1727096136.12620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096136.14366: done with get_vars() 11701 1727096136.14402: done getting variables 11701 1727096136.14474: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Monday 23 September 2024 08:55:36 -0400 (0:00:00.063) 0:00:20.109 ****** 11701 1727096136.14508: entering _queue_task() for managed_node3/shell 11701 1727096136.15087: worker is 1 (out of 1 available) 11701 1727096136.15098: exiting _queue_task() for managed_node3/shell 11701 1727096136.15107: done queuing things up, now waiting for results queue to drain 11701 1727096136.15109: waiting for pending results... 
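The skip above follows directly from the registered result of the earlier Stat profile file task: the stat reported exists: false, so the condition profile_stat.stat.exists renders false and the set_fact handler never runs. In plain Python terms (names copied from the log; Ansible evaluates the condition through its templar, not like this):

    # Registered result of the "Stat profile file" task, as reported in this run.
    profile_stat = {"changed": False, "stat": {"exists": False}}

    # The task's when-condition, logged as
    # "Evaluated conditional (profile_stat.stat.exists): False".
    if profile_stat["stat"]["exists"]:
        print("would run the set_fact handler")
    else:
        print("skipping: Conditional result was False")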
11701 1727096136.15487: running TaskExecutor() for managed_node3/TASK: Get NM profile info 11701 1727096136.15629: in run() - task 0afff68d-5257-a05c-c957-000000000443 11701 1727096136.15692: variable 'ansible_search_path' from source: unknown 11701 1727096136.15701: variable 'ansible_search_path' from source: unknown 11701 1727096136.15745: calling self._execute() 11701 1727096136.15860: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096136.15882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096136.15900: variable 'omit' from source: magic vars 11701 1727096136.16300: variable 'ansible_distribution_major_version' from source: facts 11701 1727096136.16329: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096136.16341: variable 'omit' from source: magic vars 11701 1727096136.16400: variable 'omit' from source: magic vars 11701 1727096136.16531: variable 'profile' from source: include params 11701 1727096136.16535: variable 'item' from source: include params 11701 1727096136.16603: variable 'item' from source: include params 11701 1727096136.16639: variable 'omit' from source: magic vars 11701 1727096136.17087: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096136.17091: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096136.17093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096136.17096: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096136.17098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096136.17100: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096136.17103: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096136.17104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096136.17419: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096136.17434: Set connection var ansible_timeout to 10 11701 1727096136.17442: Set connection var ansible_shell_type to sh 11701 1727096136.17454: Set connection var ansible_shell_executable to /bin/sh 11701 1727096136.17462: Set connection var ansible_connection to ssh 11701 1727096136.17478: Set connection var ansible_pipelining to False 11701 1727096136.17505: variable 'ansible_shell_executable' from source: unknown 11701 1727096136.17519: variable 'ansible_connection' from source: unknown 11701 1727096136.17527: variable 'ansible_module_compression' from source: unknown 11701 1727096136.17534: variable 'ansible_shell_type' from source: unknown 11701 1727096136.17542: variable 'ansible_shell_executable' from source: unknown 11701 1727096136.17739: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096136.17742: variable 'ansible_pipelining' from source: unknown 11701 1727096136.17745: variable 'ansible_timeout' from source: unknown 11701 1727096136.17747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096136.17914: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096136.17976: variable 'omit' from source: magic vars 11701 1727096136.17986: starting attempt loop 11701 1727096136.17994: running the handler 11701 1727096136.18010: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096136.18091: _low_level_execute_command(): starting 11701 1727096136.18105: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096136.19690: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096136.19797: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096136.19928: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096136.19964: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096136.21690: stdout chunk (state=3): >>>/root <<< 11701 1727096136.21783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096136.21855: stderr chunk (state=3): >>><<< 11701 1727096136.22060: stdout chunk (state=3): >>><<< 11701 1727096136.22065: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096136.22069: _low_level_execute_command(): starting 11701 1727096136.22073: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096136.2199147-12629-27215051937276 `" && echo ansible-tmp-1727096136.2199147-12629-27215051937276="` echo /root/.ansible/tmp/ansible-tmp-1727096136.2199147-12629-27215051937276 `" ) && sleep 0' 11701 1727096136.23218: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096136.23235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096136.23498: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096136.23588: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096136.25561: stdout chunk (state=3): >>>ansible-tmp-1727096136.2199147-12629-27215051937276=/root/.ansible/tmp/ansible-tmp-1727096136.2199147-12629-27215051937276 <<< 11701 1727096136.25662: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096136.25755: stderr chunk (state=3): >>><<< 11701 1727096136.25759: stdout chunk (state=3): >>><<< 11701 1727096136.26077: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096136.2199147-12629-27215051937276=/root/.ansible/tmp/ansible-tmp-1727096136.2199147-12629-27215051937276 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096136.26081: variable 'ansible_module_compression' from source: unknown 11701 1727096136.26084: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11701ub3f79xu/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11701 1727096136.26115: variable 'ansible_facts' from source: unknown 11701 1727096136.26310: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096136.2199147-12629-27215051937276/AnsiballZ_command.py 11701 1727096136.26544: Sending initial data 11701 1727096136.26547: Sent initial data (155 bytes) 11701 1727096136.28219: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 11701 1727096136.28224: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096136.28228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096136.28230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096136.28233: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096136.28386: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096136.28429: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096136.30084: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096136.30090: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11701 1727096136.30129: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmp7a3gh1ie /root/.ansible/tmp/ansible-tmp-1727096136.2199147-12629-27215051937276/AnsiballZ_command.py <<< 11701 1727096136.30133: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096136.2199147-12629-27215051937276/AnsiballZ_command.py" <<< 11701 1727096136.30223: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmp7a3gh1ie" to remote "/root/.ansible/tmp/ansible-tmp-1727096136.2199147-12629-27215051937276/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096136.2199147-12629-27215051937276/AnsiballZ_command.py" <<< 11701 1727096136.31124: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096136.31130: stderr chunk (state=3): >>><<< 11701 1727096136.31133: stdout chunk (state=3): >>><<< 11701 1727096136.31159: done transferring module to remote 11701 1727096136.31171: _low_level_execute_command(): starting 11701 1727096136.31177: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096136.2199147-12629-27215051937276/ /root/.ansible/tmp/ansible-tmp-1727096136.2199147-12629-27215051937276/AnsiballZ_command.py && sleep 0' 11701 1727096136.32014: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096136.32018: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096136.32020: stderr chunk (state=3): >>>debug2: match found <<< 11701 1727096136.32044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096136.32052: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096136.32128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096136.34354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096136.34360: stdout chunk (state=3): >>><<< 11701 1727096136.34370: stderr chunk (state=3): >>><<< 11701 1727096136.34500: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096136.34503: _low_level_execute_command(): starting 11701 1727096136.34506: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096136.2199147-12629-27215051937276/AnsiballZ_command.py && sleep 0' 11701 1727096136.35474: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096136.35478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096136.35687: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 11701 1727096136.35691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11701 1727096136.35770: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096136.35774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096136.35777: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096136.35800: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096136.35871: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096136.53493: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-23 08:55:36.509680", "end": "2024-09-23 08:55:36.530591", "delta": "0:00:00.020911", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11701 1727096136.55061: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 11701 1727096136.55066: stderr chunk (state=3): >>><<< 11701 1727096136.55071: stdout chunk (state=3): >>><<< 11701 1727096136.55275: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-23 08:55:36.509680", "end": "2024-09-23 08:55:36.530591", "delta": "0:00:00.020911", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
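The shell task's pipeline, nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc, returned exactly one line, which is how the role concludes that NetworkManager holds a persistent profile for bond0.1 under /etc. The sketch below reproduces the same check outside Ansible; it assumes nmcli is installed where it runs, and the whitespace-based parsing is a simplification of the logged grep pipeline (it would mishandle connection names containing spaces).

    import subprocess

    def nm_profile_file(profile):
        """Return the /etc path of the NetworkManager profile named `profile`, or None."""
        out = subprocess.run(
            ["nmcli", "-f", "NAME,FILENAME", "connection", "show"],
            capture_output=True, text=True, check=True,
        ).stdout
        for line in out.splitlines():
            fields = line.split()
            # Same intent as the logged pipeline: match the name, keep /etc paths.
            if len(fields) >= 2 and fields[0] == profile and fields[1].startswith("/etc"):
                return fields[1]
        return None

    print(nm_profile_file("bond0.1"))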
11701 1727096136.55279: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096136.2199147-12629-27215051937276/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096136.55283: _low_level_execute_command(): starting 11701 1727096136.55285: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096136.2199147-12629-27215051937276/ > /dev/null 2>&1 && sleep 0' 11701 1727096136.56314: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096136.56573: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096136.56878: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096136.56951: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096136.58817: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096136.58874: stderr chunk (state=3): >>><<< 11701 1727096136.59086: stdout chunk (state=3): >>><<< 11701 1727096136.59111: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 
10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096136.59115: handler run complete 11701 1727096136.59220: Evaluated conditional (False): False 11701 1727096136.59223: attempt loop complete, returning result 11701 1727096136.59225: _execute() done 11701 1727096136.59226: dumping result to json 11701 1727096136.59228: done dumping result, returning 11701 1727096136.59230: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0afff68d-5257-a05c-c957-000000000443] 11701 1727096136.59231: sending task result for task 0afff68d-5257-a05c-c957-000000000443 11701 1727096136.59299: done sending task result for task 0afff68d-5257-a05c-c957-000000000443 11701 1727096136.59303: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": "0:00:00.020911", "end": "2024-09-23 08:55:36.530591", "rc": 0, "start": "2024-09-23 08:55:36.509680" } STDOUT: bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 11701 1727096136.59397: no more pending results, returning what we have 11701 1727096136.59401: results queue empty 11701 1727096136.59402: checking for any_errors_fatal 11701 1727096136.59406: done checking for any_errors_fatal 11701 1727096136.59407: checking for max_fail_percentage 11701 1727096136.59409: done checking for max_fail_percentage 11701 1727096136.59410: checking to see if all hosts have failed and the running result is not ok 11701 1727096136.59411: done checking to see if all hosts have failed 11701 1727096136.59411: getting the remaining hosts for this loop 11701 1727096136.59412: done getting the remaining hosts for this loop 11701 1727096136.59415: getting the next task for host managed_node3 11701 1727096136.59423: done getting next task for host managed_node3 11701 1727096136.59426: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11701 1727096136.59435: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096136.59439: getting variables 11701 1727096136.59441: in VariableManager get_vars() 11701 1727096136.59484: Calling all_inventory to load vars for managed_node3 11701 1727096136.59487: Calling groups_inventory to load vars for managed_node3 11701 1727096136.59490: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096136.59501: Calling all_plugins_play to load vars for managed_node3 11701 1727096136.59505: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096136.59508: Calling groups_plugins_play to load vars for managed_node3 11701 1727096136.62821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096136.66255: done with get_vars() 11701 1727096136.66403: done getting variables 11701 1727096136.66460: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Monday 23 September 2024 08:55:36 -0400 (0:00:00.520) 0:00:20.630 ****** 11701 1727096136.66602: entering _queue_task() for managed_node3/set_fact 11701 1727096136.67477: worker is 1 (out of 1 available) 11701 1727096136.67489: exiting _queue_task() for managed_node3/set_fact 11701 1727096136.67499: done queuing things up, now waiting for results queue to drain 11701 1727096136.67500: waiting for pending results... 
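The ok result above is what the role registers; the task queued next refers to it as nm_profile_exists and checks its return code, and the facts it derives from a zero exit status appear in that task's result a little further down. The values below are copied from this run's JSON; the mapping itself is a hedged reconstruction, since the actual task body in get_profile_stat.yml is not reproduced in this log.

    # Registered result of the "Get NM profile info" task, field values copied
    # from the JSON returned by the command module in this run.
    nm_profile_exists = {
        "rc": 0,
        "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc",
        "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection",
        "stderr": "",
        "start": "2024-09-23 08:55:36.509680",
        "end": "2024-09-23 08:55:36.530591",
        "delta": "0:00:00.020911",
    }

    # Condition logged as "Evaluated conditional (nm_profile_exists.rc == 0): True",
    # and the three facts reported in the following task's result.
    facts = {}
    if nm_profile_exists["rc"] == 0:
        facts = {
            "lsr_net_profile_exists": True,
            "lsr_net_profile_ansible_managed": True,
            "lsr_net_profile_fingerprint": True,
        }
    print(facts)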
11701 1727096136.68096: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11701 1727096136.68219: in run() - task 0afff68d-5257-a05c-c957-000000000444 11701 1727096136.68254: variable 'ansible_search_path' from source: unknown 11701 1727096136.68347: variable 'ansible_search_path' from source: unknown 11701 1727096136.68456: calling self._execute() 11701 1727096136.68675: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096136.68679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096136.68682: variable 'omit' from source: magic vars 11701 1727096136.69538: variable 'ansible_distribution_major_version' from source: facts 11701 1727096136.69608: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096136.70140: variable 'nm_profile_exists' from source: set_fact 11701 1727096136.70144: Evaluated conditional (nm_profile_exists.rc == 0): True 11701 1727096136.70147: variable 'omit' from source: magic vars 11701 1727096136.70149: variable 'omit' from source: magic vars 11701 1727096136.70359: variable 'omit' from source: magic vars 11701 1727096136.70363: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096136.70366: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096136.70487: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096136.70514: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096136.70531: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096136.70569: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096136.70610: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096136.70619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096136.70905: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096136.70919: Set connection var ansible_timeout to 10 11701 1727096136.70932: Set connection var ansible_shell_type to sh 11701 1727096136.70943: Set connection var ansible_shell_executable to /bin/sh 11701 1727096136.70949: Set connection var ansible_connection to ssh 11701 1727096136.70963: Set connection var ansible_pipelining to False 11701 1727096136.71033: variable 'ansible_shell_executable' from source: unknown 11701 1727096136.71153: variable 'ansible_connection' from source: unknown 11701 1727096136.71157: variable 'ansible_module_compression' from source: unknown 11701 1727096136.71159: variable 'ansible_shell_type' from source: unknown 11701 1727096136.71161: variable 'ansible_shell_executable' from source: unknown 11701 1727096136.71163: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096136.71165: variable 'ansible_pipelining' from source: unknown 11701 1727096136.71169: variable 'ansible_timeout' from source: unknown 11701 1727096136.71172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096136.71490: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096136.71669: variable 'omit' from source: magic vars 11701 1727096136.71672: starting attempt loop 11701 1727096136.71675: running the handler 11701 1727096136.71678: handler run complete 11701 1727096136.71680: attempt loop complete, returning result 11701 1727096136.71682: _execute() done 11701 1727096136.71686: dumping result to json 11701 1727096136.71688: done dumping result, returning 11701 1727096136.71691: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0afff68d-5257-a05c-c957-000000000444] 11701 1727096136.71776: sending task result for task 0afff68d-5257-a05c-c957-000000000444 11701 1727096136.71846: done sending task result for task 0afff68d-5257-a05c-c957-000000000444 11701 1727096136.71849: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11701 1727096136.71938: no more pending results, returning what we have 11701 1727096136.71941: results queue empty 11701 1727096136.71942: checking for any_errors_fatal 11701 1727096136.71948: done checking for any_errors_fatal 11701 1727096136.71949: checking for max_fail_percentage 11701 1727096136.71951: done checking for max_fail_percentage 11701 1727096136.71952: checking to see if all hosts have failed and the running result is not ok 11701 1727096136.71953: done checking to see if all hosts have failed 11701 1727096136.71954: getting the remaining hosts for this loop 11701 1727096136.71955: done getting the remaining hosts for this loop 11701 1727096136.72076: getting the next task for host managed_node3 11701 1727096136.72087: done getting next task for host managed_node3 11701 1727096136.72090: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11701 1727096136.72095: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096136.72099: getting variables 11701 1727096136.72101: in VariableManager get_vars() 11701 1727096136.72147: Calling all_inventory to load vars for managed_node3 11701 1727096136.72150: Calling groups_inventory to load vars for managed_node3 11701 1727096136.72153: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096136.72166: Calling all_plugins_play to load vars for managed_node3 11701 1727096136.72486: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096136.72491: Calling groups_plugins_play to load vars for managed_node3 11701 1727096136.75348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096136.79087: done with get_vars() 11701 1727096136.79171: done getting variables 11701 1727096136.79351: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11701 1727096136.79583: variable 'profile' from source: include params 11701 1727096136.79587: variable 'item' from source: include params 11701 1727096136.79647: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.1] ************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Monday 23 September 2024 08:55:36 -0400 (0:00:00.132) 0:00:20.762 ****** 11701 1727096136.79811: entering _queue_task() for managed_node3/command 11701 1727096136.80532: worker is 1 (out of 1 available) 11701 1727096136.80545: exiting _queue_task() for managed_node3/command 11701 1727096136.80557: done queuing things up, now waiting for results queue to drain 11701 1727096136.80558: waiting for pending results... 
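
The set_fact task that just completed reported three facts, all true, after evaluating nm_profile_exists.rc == 0. A sketch consistent with the logged facts and conditional (the actual get_profile_stat.yml in the collection may differ in detail):

  - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
    ansible.builtin.set_fact:
      lsr_net_profile_exists: true
      lsr_net_profile_ansible_managed: true
      lsr_net_profile_fingerprint: true
    when: nm_profile_exists.rc == 0
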
11701 1727096136.81592: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.1 11701 1727096136.81598: in run() - task 0afff68d-5257-a05c-c957-000000000446 11701 1727096136.81602: variable 'ansible_search_path' from source: unknown 11701 1727096136.81604: variable 'ansible_search_path' from source: unknown 11701 1727096136.81607: calling self._execute() 11701 1727096136.81762: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096136.82015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096136.82020: variable 'omit' from source: magic vars 11701 1727096136.82698: variable 'ansible_distribution_major_version' from source: facts 11701 1727096136.82722: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096136.82964: variable 'profile_stat' from source: set_fact 11701 1727096136.83009: Evaluated conditional (profile_stat.stat.exists): False 11701 1727096136.83039: when evaluation is False, skipping this task 11701 1727096136.83143: _execute() done 11701 1727096136.83146: dumping result to json 11701 1727096136.83149: done dumping result, returning 11701 1727096136.83152: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [0afff68d-5257-a05c-c957-000000000446] 11701 1727096136.83154: sending task result for task 0afff68d-5257-a05c-c957-000000000446 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11701 1727096136.83370: no more pending results, returning what we have 11701 1727096136.83375: results queue empty 11701 1727096136.83377: checking for any_errors_fatal 11701 1727096136.83382: done checking for any_errors_fatal 11701 1727096136.83383: checking for max_fail_percentage 11701 1727096136.83385: done checking for max_fail_percentage 11701 1727096136.83386: checking to see if all hosts have failed and the running result is not ok 11701 1727096136.83387: done checking to see if all hosts have failed 11701 1727096136.83388: getting the remaining hosts for this loop 11701 1727096136.83390: done getting the remaining hosts for this loop 11701 1727096136.83393: getting the next task for host managed_node3 11701 1727096136.83402: done getting next task for host managed_node3 11701 1727096136.83405: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11701 1727096136.83410: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096136.83415: getting variables 11701 1727096136.83417: in VariableManager get_vars() 11701 1727096136.83469: Calling all_inventory to load vars for managed_node3 11701 1727096136.83472: Calling groups_inventory to load vars for managed_node3 11701 1727096136.83476: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096136.83489: Calling all_plugins_play to load vars for managed_node3 11701 1727096136.83493: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096136.83496: Calling groups_plugins_play to load vars for managed_node3 11701 1727096136.84275: done sending task result for task 0afff68d-5257-a05c-c957-000000000446 11701 1727096136.84278: WORKER PROCESS EXITING 11701 1727096136.86720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096136.90113: done with get_vars() 11701 1727096136.90146: done getting variables 11701 1727096136.90326: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11701 1727096136.90562: variable 'profile' from source: include params 11701 1727096136.90566: variable 'item' from source: include params 11701 1727096136.90694: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Monday 23 September 2024 08:55:36 -0400 (0:00:00.109) 0:00:20.871 ****** 11701 1727096136.90974: entering _queue_task() for managed_node3/set_fact 11701 1727096136.91493: worker is 1 (out of 1 available) 11701 1727096136.91729: exiting _queue_task() for managed_node3/set_fact 11701 1727096136.91740: done queuing things up, now waiting for results queue to drain 11701 1727096136.91742: waiting for pending results... 
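
This task, together with the "Verify the ansible_managed comment", "Get the fingerprint comment", and "Verify the fingerprint comment" tasks that follow, is skipped with false_condition profile_stat.stat.exists: the bond0.1 profile lives as a keyfile under /etc/NetworkManager/system-connections, so no ifcfg file is present to inspect. A hedged sketch of the gating pattern; the grep expression and register name are illustrative only and do not come from the log:

  - name: Get the ansible_managed comment in ifcfg-{{ profile }}
    ansible.builtin.command: grep '^# Ansible managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # hypothetical command
    register: ifcfg_ansible_managed   # hypothetical register name
    when: profile_stat.stat.exists    # the condition the log reports as False
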
11701 1727096136.92161: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 11701 1727096136.92494: in run() - task 0afff68d-5257-a05c-c957-000000000447 11701 1727096136.92499: variable 'ansible_search_path' from source: unknown 11701 1727096136.92502: variable 'ansible_search_path' from source: unknown 11701 1727096136.92604: calling self._execute() 11701 1727096136.92775: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096136.92831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096136.92846: variable 'omit' from source: magic vars 11701 1727096136.93755: variable 'ansible_distribution_major_version' from source: facts 11701 1727096136.93759: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096136.93928: variable 'profile_stat' from source: set_fact 11701 1727096136.93989: Evaluated conditional (profile_stat.stat.exists): False 11701 1727096136.93999: when evaluation is False, skipping this task 11701 1727096136.94024: _execute() done 11701 1727096136.94033: dumping result to json 11701 1727096136.94127: done dumping result, returning 11701 1727096136.94130: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [0afff68d-5257-a05c-c957-000000000447] 11701 1727096136.94133: sending task result for task 0afff68d-5257-a05c-c957-000000000447 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11701 1727096136.94391: no more pending results, returning what we have 11701 1727096136.94396: results queue empty 11701 1727096136.94398: checking for any_errors_fatal 11701 1727096136.94406: done checking for any_errors_fatal 11701 1727096136.94407: checking for max_fail_percentage 11701 1727096136.94409: done checking for max_fail_percentage 11701 1727096136.94410: checking to see if all hosts have failed and the running result is not ok 11701 1727096136.94411: done checking to see if all hosts have failed 11701 1727096136.94412: getting the remaining hosts for this loop 11701 1727096136.94414: done getting the remaining hosts for this loop 11701 1727096136.94418: getting the next task for host managed_node3 11701 1727096136.94426: done getting next task for host managed_node3 11701 1727096136.94431: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11701 1727096136.94436: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096136.94441: getting variables 11701 1727096136.94443: in VariableManager get_vars() 11701 1727096136.94501: Calling all_inventory to load vars for managed_node3 11701 1727096136.94504: Calling groups_inventory to load vars for managed_node3 11701 1727096136.94508: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096136.94792: Calling all_plugins_play to load vars for managed_node3 11701 1727096136.94797: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096136.94800: Calling groups_plugins_play to load vars for managed_node3 11701 1727096136.95402: done sending task result for task 0afff68d-5257-a05c-c957-000000000447 11701 1727096136.95406: WORKER PROCESS EXITING 11701 1727096137.10438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096137.13376: done with get_vars() 11701 1727096137.13416: done getting variables 11701 1727096137.13471: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11701 1727096137.13578: variable 'profile' from source: include params 11701 1727096137.13582: variable 'item' from source: include params 11701 1727096137.13649: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Monday 23 September 2024 08:55:37 -0400 (0:00:00.229) 0:00:21.101 ****** 11701 1727096137.13681: entering _queue_task() for managed_node3/command 11701 1727096137.14211: worker is 1 (out of 1 available) 11701 1727096137.14223: exiting _queue_task() for managed_node3/command 11701 1727096137.14235: done queuing things up, now waiting for results queue to drain 11701 1727096137.14236: waiting for pending results... 
11701 1727096137.14546: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.1 11701 1727096137.14703: in run() - task 0afff68d-5257-a05c-c957-000000000448 11701 1727096137.14735: variable 'ansible_search_path' from source: unknown 11701 1727096137.14743: variable 'ansible_search_path' from source: unknown 11701 1727096137.14786: calling self._execute() 11701 1727096137.14895: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096137.14907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096137.14920: variable 'omit' from source: magic vars 11701 1727096137.15320: variable 'ansible_distribution_major_version' from source: facts 11701 1727096137.15337: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096137.15472: variable 'profile_stat' from source: set_fact 11701 1727096137.15499: Evaluated conditional (profile_stat.stat.exists): False 11701 1727096137.15508: when evaluation is False, skipping this task 11701 1727096137.15514: _execute() done 11701 1727096137.15521: dumping result to json 11701 1727096137.15528: done dumping result, returning 11701 1727096137.15538: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.1 [0afff68d-5257-a05c-c957-000000000448] 11701 1727096137.15547: sending task result for task 0afff68d-5257-a05c-c957-000000000448 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11701 1727096137.15751: no more pending results, returning what we have 11701 1727096137.15755: results queue empty 11701 1727096137.15757: checking for any_errors_fatal 11701 1727096137.15765: done checking for any_errors_fatal 11701 1727096137.15765: checking for max_fail_percentage 11701 1727096137.15769: done checking for max_fail_percentage 11701 1727096137.15770: checking to see if all hosts have failed and the running result is not ok 11701 1727096137.15771: done checking to see if all hosts have failed 11701 1727096137.15772: getting the remaining hosts for this loop 11701 1727096137.15773: done getting the remaining hosts for this loop 11701 1727096137.15777: getting the next task for host managed_node3 11701 1727096137.15786: done getting next task for host managed_node3 11701 1727096137.15789: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11701 1727096137.15793: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096137.15798: getting variables 11701 1727096137.15799: in VariableManager get_vars() 11701 1727096137.15853: Calling all_inventory to load vars for managed_node3 11701 1727096137.15856: Calling groups_inventory to load vars for managed_node3 11701 1727096137.15859: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096137.15925: Calling all_plugins_play to load vars for managed_node3 11701 1727096137.15929: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096137.15933: Calling groups_plugins_play to load vars for managed_node3 11701 1727096137.16627: done sending task result for task 0afff68d-5257-a05c-c957-000000000448 11701 1727096137.16631: WORKER PROCESS EXITING 11701 1727096137.18020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096137.19850: done with get_vars() 11701 1727096137.19891: done getting variables 11701 1727096137.19956: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11701 1727096137.20080: variable 'profile' from source: include params 11701 1727096137.20084: variable 'item' from source: include params 11701 1727096137.20149: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Monday 23 September 2024 08:55:37 -0400 (0:00:00.064) 0:00:21.166 ****** 11701 1727096137.20183: entering _queue_task() for managed_node3/set_fact 11701 1727096137.20784: worker is 1 (out of 1 available) 11701 1727096137.20794: exiting _queue_task() for managed_node3/set_fact 11701 1727096137.20803: done queuing things up, now waiting for results queue to drain 11701 1727096137.20804: waiting for pending results... 
11701 1727096137.20883: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.1 11701 1727096137.21037: in run() - task 0afff68d-5257-a05c-c957-000000000449 11701 1727096137.21060: variable 'ansible_search_path' from source: unknown 11701 1727096137.21069: variable 'ansible_search_path' from source: unknown 11701 1727096137.21112: calling self._execute() 11701 1727096137.21223: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096137.21234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096137.21257: variable 'omit' from source: magic vars 11701 1727096137.21639: variable 'ansible_distribution_major_version' from source: facts 11701 1727096137.21657: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096137.21801: variable 'profile_stat' from source: set_fact 11701 1727096137.21821: Evaluated conditional (profile_stat.stat.exists): False 11701 1727096137.21831: when evaluation is False, skipping this task 11701 1727096137.21901: _execute() done 11701 1727096137.21908: dumping result to json 11701 1727096137.21912: done dumping result, returning 11701 1727096137.21915: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [0afff68d-5257-a05c-c957-000000000449] 11701 1727096137.22014: sending task result for task 0afff68d-5257-a05c-c957-000000000449 11701 1727096137.22089: done sending task result for task 0afff68d-5257-a05c-c957-000000000449 11701 1727096137.22092: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11701 1727096137.22150: no more pending results, returning what we have 11701 1727096137.22155: results queue empty 11701 1727096137.22156: checking for any_errors_fatal 11701 1727096137.22162: done checking for any_errors_fatal 11701 1727096137.22163: checking for max_fail_percentage 11701 1727096137.22165: done checking for max_fail_percentage 11701 1727096137.22166: checking to see if all hosts have failed and the running result is not ok 11701 1727096137.22169: done checking to see if all hosts have failed 11701 1727096137.22170: getting the remaining hosts for this loop 11701 1727096137.22172: done getting the remaining hosts for this loop 11701 1727096137.22175: getting the next task for host managed_node3 11701 1727096137.22185: done getting next task for host managed_node3 11701 1727096137.22189: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11701 1727096137.22192: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096137.22197: getting variables 11701 1727096137.22199: in VariableManager get_vars() 11701 1727096137.22371: Calling all_inventory to load vars for managed_node3 11701 1727096137.22374: Calling groups_inventory to load vars for managed_node3 11701 1727096137.22377: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096137.22390: Calling all_plugins_play to load vars for managed_node3 11701 1727096137.22392: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096137.22395: Calling groups_plugins_play to load vars for managed_node3 11701 1727096137.24933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096137.27466: done with get_vars() 11701 1727096137.27503: done getting variables 11701 1727096137.27589: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11701 1727096137.27721: variable 'profile' from source: include params 11701 1727096137.27748: variable 'item' from source: include params 11701 1727096137.27819: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Monday 23 September 2024 08:55:37 -0400 (0:00:00.076) 0:00:21.243 ****** 11701 1727096137.27852: entering _queue_task() for managed_node3/assert 11701 1727096137.28211: worker is 1 (out of 1 available) 11701 1727096137.28225: exiting _queue_task() for managed_node3/assert 11701 1727096137.28237: done queuing things up, now waiting for results queue to drain 11701 1727096137.28239: waiting for pending results... 
11701 1727096137.28833: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.1' 11701 1727096137.28839: in run() - task 0afff68d-5257-a05c-c957-00000000026e 11701 1727096137.28842: variable 'ansible_search_path' from source: unknown 11701 1727096137.28845: variable 'ansible_search_path' from source: unknown 11701 1727096137.28848: calling self._execute() 11701 1727096137.28852: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096137.28857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096137.28861: variable 'omit' from source: magic vars 11701 1727096137.29247: variable 'ansible_distribution_major_version' from source: facts 11701 1727096137.29265: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096137.29272: variable 'omit' from source: magic vars 11701 1727096137.29394: variable 'omit' from source: magic vars 11701 1727096137.29443: variable 'profile' from source: include params 11701 1727096137.29447: variable 'item' from source: include params 11701 1727096137.29485: variable 'item' from source: include params 11701 1727096137.29503: variable 'omit' from source: magic vars 11701 1727096137.29599: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096137.29604: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096137.29606: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096137.29636: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096137.29800: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096137.29920: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096137.29923: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096137.29926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096137.30274: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096137.30277: Set connection var ansible_timeout to 10 11701 1727096137.30279: Set connection var ansible_shell_type to sh 11701 1727096137.30281: Set connection var ansible_shell_executable to /bin/sh 11701 1727096137.30283: Set connection var ansible_connection to ssh 11701 1727096137.30387: Set connection var ansible_pipelining to False 11701 1727096137.30409: variable 'ansible_shell_executable' from source: unknown 11701 1727096137.30412: variable 'ansible_connection' from source: unknown 11701 1727096137.30415: variable 'ansible_module_compression' from source: unknown 11701 1727096137.30417: variable 'ansible_shell_type' from source: unknown 11701 1727096137.30420: variable 'ansible_shell_executable' from source: unknown 11701 1727096137.30422: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096137.30427: variable 'ansible_pipelining' from source: unknown 11701 1727096137.30430: variable 'ansible_timeout' from source: unknown 11701 1727096137.30433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096137.30974: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096137.30978: variable 'omit' from source: magic vars 11701 1727096137.30981: starting attempt loop 11701 1727096137.30984: running the handler 11701 1727096137.31034: variable 'lsr_net_profile_exists' from source: set_fact 11701 1727096137.31037: Evaluated conditional (lsr_net_profile_exists): True 11701 1727096137.31040: handler run complete 11701 1727096137.31060: attempt loop complete, returning result 11701 1727096137.31064: _execute() done 11701 1727096137.31066: dumping result to json 11701 1727096137.31071: done dumping result, returning 11701 1727096137.31343: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.1' [0afff68d-5257-a05c-c957-00000000026e] 11701 1727096137.31345: sending task result for task 0afff68d-5257-a05c-c957-00000000026e 11701 1727096137.31414: done sending task result for task 0afff68d-5257-a05c-c957-00000000026e 11701 1727096137.31418: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11701 1727096137.31476: no more pending results, returning what we have 11701 1727096137.31480: results queue empty 11701 1727096137.31481: checking for any_errors_fatal 11701 1727096137.31489: done checking for any_errors_fatal 11701 1727096137.31490: checking for max_fail_percentage 11701 1727096137.31492: done checking for max_fail_percentage 11701 1727096137.31493: checking to see if all hosts have failed and the running result is not ok 11701 1727096137.31495: done checking to see if all hosts have failed 11701 1727096137.31495: getting the remaining hosts for this loop 11701 1727096137.31497: done getting the remaining hosts for this loop 11701 1727096137.31500: getting the next task for host managed_node3 11701 1727096137.31506: done getting next task for host managed_node3 11701 1727096137.31509: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11701 1727096137.31512: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096137.31517: getting variables 11701 1727096137.31518: in VariableManager get_vars() 11701 1727096137.31570: Calling all_inventory to load vars for managed_node3 11701 1727096137.31573: Calling groups_inventory to load vars for managed_node3 11701 1727096137.31577: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096137.31589: Calling all_plugins_play to load vars for managed_node3 11701 1727096137.31592: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096137.31595: Calling groups_plugins_play to load vars for managed_node3 11701 1727096137.33285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096137.35571: done with get_vars() 11701 1727096137.35620: done getting variables 11701 1727096137.35689: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11701 1727096137.35812: variable 'profile' from source: include params 11701 1727096137.35816: variable 'item' from source: include params 11701 1727096137.35883: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.1'] ********* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Monday 23 September 2024 08:55:37 -0400 (0:00:00.080) 0:00:21.323 ****** 11701 1727096137.35925: entering _queue_task() for managed_node3/assert 11701 1727096137.36289: worker is 1 (out of 1 available) 11701 1727096137.36302: exiting _queue_task() for managed_node3/assert 11701 1727096137.36314: done queuing things up, now waiting for results queue to drain 11701 1727096137.36315: waiting for pending results... 
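
The assertion that just passed and the two that follow it check the flags set by the earlier set_fact task. A sketch consistent with the task names and conditionals in this log (assert_profile_present.yml lines 5, 10 and 15); any failure messages the real tasks carry are omitted here:

  - name: "Assert that the profile is present - '{{ profile }}'"
    ansible.builtin.assert:
      that: lsr_net_profile_exists

  - name: "Assert that the ansible managed comment is present in '{{ profile }}'"
    ansible.builtin.assert:
      that: lsr_net_profile_ansible_managed

  - name: "Assert that the fingerprint comment is present in {{ profile }}"
    ansible.builtin.assert:
      that: lsr_net_profile_fingerprint
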
11701 1727096137.36838: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.1' 11701 1727096137.36843: in run() - task 0afff68d-5257-a05c-c957-00000000026f 11701 1727096137.36846: variable 'ansible_search_path' from source: unknown 11701 1727096137.36849: variable 'ansible_search_path' from source: unknown 11701 1727096137.36851: calling self._execute() 11701 1727096137.36935: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096137.36939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096137.36942: variable 'omit' from source: magic vars 11701 1727096137.37371: variable 'ansible_distribution_major_version' from source: facts 11701 1727096137.37374: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096137.37378: variable 'omit' from source: magic vars 11701 1727096137.37381: variable 'omit' from source: magic vars 11701 1727096137.37421: variable 'profile' from source: include params 11701 1727096137.37425: variable 'item' from source: include params 11701 1727096137.37499: variable 'item' from source: include params 11701 1727096137.37518: variable 'omit' from source: magic vars 11701 1727096137.37583: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096137.37600: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096137.37619: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096137.37637: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096137.37649: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096137.37800: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096137.37804: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096137.37806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096137.37809: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096137.37811: Set connection var ansible_timeout to 10 11701 1727096137.37814: Set connection var ansible_shell_type to sh 11701 1727096137.37816: Set connection var ansible_shell_executable to /bin/sh 11701 1727096137.37818: Set connection var ansible_connection to ssh 11701 1727096137.37820: Set connection var ansible_pipelining to False 11701 1727096137.37856: variable 'ansible_shell_executable' from source: unknown 11701 1727096137.37859: variable 'ansible_connection' from source: unknown 11701 1727096137.37882: variable 'ansible_module_compression' from source: unknown 11701 1727096137.37885: variable 'ansible_shell_type' from source: unknown 11701 1727096137.37887: variable 'ansible_shell_executable' from source: unknown 11701 1727096137.37890: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096137.37892: variable 'ansible_pipelining' from source: unknown 11701 1727096137.37908: variable 'ansible_timeout' from source: unknown 11701 1727096137.37923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096137.38128: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096137.38151: variable 'omit' from source: magic vars 11701 1727096137.38166: starting attempt loop 11701 1727096137.38178: running the handler 11701 1727096137.38344: variable 'lsr_net_profile_ansible_managed' from source: set_fact 11701 1727096137.38453: Evaluated conditional (lsr_net_profile_ansible_managed): True 11701 1727096137.38456: handler run complete 11701 1727096137.38458: attempt loop complete, returning result 11701 1727096137.38460: _execute() done 11701 1727096137.38462: dumping result to json 11701 1727096137.38463: done dumping result, returning 11701 1727096137.38465: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.1' [0afff68d-5257-a05c-c957-00000000026f] 11701 1727096137.38467: sending task result for task 0afff68d-5257-a05c-c957-00000000026f 11701 1727096137.38536: done sending task result for task 0afff68d-5257-a05c-c957-00000000026f 11701 1727096137.38541: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11701 1727096137.38612: no more pending results, returning what we have 11701 1727096137.38617: results queue empty 11701 1727096137.38618: checking for any_errors_fatal 11701 1727096137.38627: done checking for any_errors_fatal 11701 1727096137.38628: checking for max_fail_percentage 11701 1727096137.38629: done checking for max_fail_percentage 11701 1727096137.38631: checking to see if all hosts have failed and the running result is not ok 11701 1727096137.38632: done checking to see if all hosts have failed 11701 1727096137.38632: getting the remaining hosts for this loop 11701 1727096137.38634: done getting the remaining hosts for this loop 11701 1727096137.38637: getting the next task for host managed_node3 11701 1727096137.38644: done getting next task for host managed_node3 11701 1727096137.38646: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11701 1727096137.38649: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096137.38657: getting variables 11701 1727096137.38662: in VariableManager get_vars() 11701 1727096137.38715: Calling all_inventory to load vars for managed_node3 11701 1727096137.38718: Calling groups_inventory to load vars for managed_node3 11701 1727096137.38720: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096137.38732: Calling all_plugins_play to load vars for managed_node3 11701 1727096137.38737: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096137.38741: Calling groups_plugins_play to load vars for managed_node3 11701 1727096137.41132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096137.43362: done with get_vars() 11701 1727096137.43595: done getting variables 11701 1727096137.43654: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11701 1727096137.43972: variable 'profile' from source: include params 11701 1727096137.43976: variable 'item' from source: include params 11701 1727096137.44036: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.1] *************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Monday 23 September 2024 08:55:37 -0400 (0:00:00.081) 0:00:21.405 ****** 11701 1727096137.44075: entering _queue_task() for managed_node3/assert 11701 1727096137.44865: worker is 1 (out of 1 available) 11701 1727096137.44883: exiting _queue_task() for managed_node3/assert 11701 1727096137.44897: done queuing things up, now waiting for results queue to drain 11701 1727096137.44898: waiting for pending results... 
11701 1727096137.45522: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.1 11701 1727096137.45632: in run() - task 0afff68d-5257-a05c-c957-000000000270 11701 1727096137.45775: variable 'ansible_search_path' from source: unknown 11701 1727096137.45782: variable 'ansible_search_path' from source: unknown 11701 1727096137.45788: calling self._execute() 11701 1727096137.46234: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096137.46334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096137.46351: variable 'omit' from source: magic vars 11701 1727096137.47575: variable 'ansible_distribution_major_version' from source: facts 11701 1727096137.47685: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096137.47704: variable 'omit' from source: magic vars 11701 1727096137.47938: variable 'omit' from source: magic vars 11701 1727096137.48245: variable 'profile' from source: include params 11701 1727096137.48249: variable 'item' from source: include params 11701 1727096137.48620: variable 'item' from source: include params 11701 1727096137.48624: variable 'omit' from source: magic vars 11701 1727096137.48627: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096137.48689: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096137.48710: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096137.48727: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096137.48741: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096137.48772: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096137.48776: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096137.48778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096137.49170: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096137.49180: Set connection var ansible_timeout to 10 11701 1727096137.49183: Set connection var ansible_shell_type to sh 11701 1727096137.49189: Set connection var ansible_shell_executable to /bin/sh 11701 1727096137.49191: Set connection var ansible_connection to ssh 11701 1727096137.49203: Set connection var ansible_pipelining to False 11701 1727096137.49242: variable 'ansible_shell_executable' from source: unknown 11701 1727096137.49246: variable 'ansible_connection' from source: unknown 11701 1727096137.49248: variable 'ansible_module_compression' from source: unknown 11701 1727096137.49253: variable 'ansible_shell_type' from source: unknown 11701 1727096137.49256: variable 'ansible_shell_executable' from source: unknown 11701 1727096137.49258: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096137.49260: variable 'ansible_pipelining' from source: unknown 11701 1727096137.49263: variable 'ansible_timeout' from source: unknown 11701 1727096137.49265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096137.49544: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096137.49556: variable 'omit' from source: magic vars 11701 1727096137.49559: starting attempt loop 11701 1727096137.49561: running the handler 11701 1727096137.50092: variable 'lsr_net_profile_fingerprint' from source: set_fact 11701 1727096137.50094: Evaluated conditional (lsr_net_profile_fingerprint): True 11701 1727096137.50097: handler run complete 11701 1727096137.50099: attempt loop complete, returning result 11701 1727096137.50100: _execute() done 11701 1727096137.50102: dumping result to json 11701 1727096137.50104: done dumping result, returning 11701 1727096137.50253: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.1 [0afff68d-5257-a05c-c957-000000000270] 11701 1727096137.50256: sending task result for task 0afff68d-5257-a05c-c957-000000000270 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11701 1727096137.50561: no more pending results, returning what we have 11701 1727096137.50565: results queue empty 11701 1727096137.50566: checking for any_errors_fatal 11701 1727096137.50576: done checking for any_errors_fatal 11701 1727096137.50577: checking for max_fail_percentage 11701 1727096137.50579: done checking for max_fail_percentage 11701 1727096137.50580: checking to see if all hosts have failed and the running result is not ok 11701 1727096137.50581: done checking to see if all hosts have failed 11701 1727096137.50581: getting the remaining hosts for this loop 11701 1727096137.50582: done getting the remaining hosts for this loop 11701 1727096137.50586: getting the next task for host managed_node3 11701 1727096137.50593: done getting next task for host managed_node3 11701 1727096137.50596: ^ task is: TASK: ** TEST check polling interval 11701 1727096137.50599: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096137.50603: getting variables 11701 1727096137.50604: in VariableManager get_vars() 11701 1727096137.50649: Calling all_inventory to load vars for managed_node3 11701 1727096137.50654: Calling groups_inventory to load vars for managed_node3 11701 1727096137.50656: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096137.50782: Calling all_plugins_play to load vars for managed_node3 11701 1727096137.50787: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096137.50792: done sending task result for task 0afff68d-5257-a05c-c957-000000000270 11701 1727096137.50795: WORKER PROCESS EXITING 11701 1727096137.50798: Calling groups_plugins_play to load vars for managed_node3 11701 1727096137.53649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096137.56105: done with get_vars() 11701 1727096137.56149: done getting variables 11701 1727096137.56219: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check polling interval] ****************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:75 Monday 23 September 2024 08:55:37 -0400 (0:00:00.122) 0:00:21.527 ****** 11701 1727096137.56300: entering _queue_task() for managed_node3/command 11701 1727096137.56899: worker is 1 (out of 1 available) 11701 1727096137.56911: exiting _queue_task() for managed_node3/command 11701 1727096137.56920: done queuing things up, now waiting for results queue to drain 11701 1727096137.56921: waiting for pending results... 
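
The "** TEST check polling interval" task (tests_bond.yml:75) is only being queued at this point, and its command is not shown in this excerpt. Purely as an illustration of how such a check could be written against the controller_device play variable, with the file path and expected value being assumptions rather than facts from the log:

  - name: "** TEST check polling interval"
    ansible.builtin.command: grep 'MII Polling Interval' /proc/net/bonding/{{ controller_device }}   # hypothetical command
    register: result
    failed_when: "'110' not in result.stdout"   # expected miimon value is an assumption
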
11701 1727096137.57255: running TaskExecutor() for managed_node3/TASK: ** TEST check polling interval 11701 1727096137.57664: in run() - task 0afff68d-5257-a05c-c957-000000000071 11701 1727096137.57672: variable 'ansible_search_path' from source: unknown 11701 1727096137.57675: calling self._execute() 11701 1727096137.57917: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096137.58011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096137.58041: variable 'omit' from source: magic vars 11701 1727096137.59009: variable 'ansible_distribution_major_version' from source: facts 11701 1727096137.59048: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096137.59173: variable 'omit' from source: magic vars 11701 1727096137.59176: variable 'omit' from source: magic vars 11701 1727096137.59443: variable 'controller_device' from source: play vars 11701 1727096137.59544: variable 'omit' from source: magic vars 11701 1727096137.59607: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096137.59732: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096137.59914: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096137.59964: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096137.60017: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096137.60224: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096137.60229: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096137.60231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096137.60692: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096137.60695: Set connection var ansible_timeout to 10 11701 1727096137.60698: Set connection var ansible_shell_type to sh 11701 1727096137.60700: Set connection var ansible_shell_executable to /bin/sh 11701 1727096137.60702: Set connection var ansible_connection to ssh 11701 1727096137.60704: Set connection var ansible_pipelining to False 11701 1727096137.60706: variable 'ansible_shell_executable' from source: unknown 11701 1727096137.60707: variable 'ansible_connection' from source: unknown 11701 1727096137.60710: variable 'ansible_module_compression' from source: unknown 11701 1727096137.60711: variable 'ansible_shell_type' from source: unknown 11701 1727096137.60713: variable 'ansible_shell_executable' from source: unknown 11701 1727096137.60715: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096137.60717: variable 'ansible_pipelining' from source: unknown 11701 1727096137.60719: variable 'ansible_timeout' from source: unknown 11701 1727096137.60721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096137.61156: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096137.61373: variable 'omit' from source: magic vars 
11701 1727096137.61377: starting attempt loop 11701 1727096137.61379: running the handler 11701 1727096137.61382: _low_level_execute_command(): starting 11701 1727096137.61384: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096137.63106: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096137.63142: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096137.63163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096137.63203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096137.63226: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096137.63309: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096137.63425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096137.63589: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096137.63639: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096137.63856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096137.65808: stdout chunk (state=3): >>>/root <<< 11701 1727096137.65841: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096137.66090: stdout chunk (state=3): >>><<< 11701 1727096137.66096: stderr chunk (state=3): >>><<< 11701 1727096137.66248: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096137.66256: _low_level_execute_command(): starting 11701 1727096137.66260: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096137.661778-12682-136659952439160 `" && echo ansible-tmp-1727096137.661778-12682-136659952439160="` echo /root/.ansible/tmp/ansible-tmp-1727096137.661778-12682-136659952439160 `" ) && sleep 0' 11701 1727096137.67196: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096137.67290: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096137.67452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096137.69491: stdout chunk (state=3): >>>ansible-tmp-1727096137.661778-12682-136659952439160=/root/.ansible/tmp/ansible-tmp-1727096137.661778-12682-136659952439160 <<< 11701 1727096137.69843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096137.69847: stdout chunk (state=3): >>><<< 11701 1727096137.69849: stderr chunk (state=3): >>><<< 11701 1727096137.69852: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096137.661778-12682-136659952439160=/root/.ansible/tmp/ansible-tmp-1727096137.661778-12682-136659952439160 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096137.69854: variable 'ansible_module_compression' from source: unknown 11701 1727096137.69975: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-11701ub3f79xu/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11701 1727096137.70020: variable 'ansible_facts' from source: unknown 11701 1727096137.70175: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096137.661778-12682-136659952439160/AnsiballZ_command.py 11701 1727096137.70529: Sending initial data 11701 1727096137.70537: Sent initial data (155 bytes) 11701 1727096137.71650: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096137.71759: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096137.71973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096137.71997: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096137.72016: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096137.72031: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096137.72140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096137.73833: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096137.73857: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11701 1727096137.73930: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096137.661778-12682-136659952439160/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpdc93s58z" to remote "/root/.ansible/tmp/ansible-tmp-1727096137.661778-12682-136659952439160/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096137.661778-12682-136659952439160/AnsiballZ_command.py" <<< 11701 1727096137.73940: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpdc93s58z /root/.ansible/tmp/ansible-tmp-1727096137.661778-12682-136659952439160/AnsiballZ_command.py <<< 11701 1727096137.75254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096137.75561: stderr chunk (state=3): >>><<< 11701 1727096137.75565: stdout chunk (state=3): >>><<< 11701 1727096137.75570: done transferring module to remote 11701 1727096137.75572: _low_level_execute_command(): starting 11701 1727096137.75574: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096137.661778-12682-136659952439160/ /root/.ansible/tmp/ansible-tmp-1727096137.661778-12682-136659952439160/AnsiballZ_command.py && sleep 0' 11701 1727096137.76830: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096137.76846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096137.77011: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096137.77088: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096137.77138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096137.79011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096137.79279: stderr chunk (state=3): >>><<< 11701 1727096137.79284: stdout chunk (state=3): >>><<< 11701 1727096137.79288: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096137.79296: _low_level_execute_command(): starting 11701 1727096137.79299: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096137.661778-12682-136659952439160/AnsiballZ_command.py && sleep 0' 11701 1727096137.80547: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096137.80560: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096137.80573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096137.80587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096137.80608: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096137.80632: stderr chunk (state=3): >>>debug2: match not found <<< 11701 1727096137.80837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096137.81076: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096137.81088: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096137.97654: stdout chunk (state=3): >>> {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-23 08:55:37.969937", "end": "2024-09-23 08:55:37.973558", "delta": "0:00:00.003621", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11701 1727096137.99475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 11701 1727096137.99480: stderr chunk (state=3): >>><<< 11701 1727096137.99482: stdout chunk (state=3): >>><<< 11701 1727096137.99485: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-23 08:55:37.969937", "end": "2024-09-23 08:55:37.973558", "delta": "0:00:00.003621", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
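The module run above returned rc=0 with stdout "MII Polling Interval (ms): 110", which is exactly what the task's conditional then matches. An equivalent, more explicit way to express the same check would be to follow the command with an assert task; this is only an alternative sketch, not something shown in this log.

    - name: Assert the bond polling interval is 110 ms   # hypothetical alternative formulation
      assert:
        that:
          - "'110' in result.stdout"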
11701 1727096137.99507: done with _execute_module (ansible.legacy.command, {'_raw_params': "grep 'Polling Interval' /proc/net/bonding/nm-bond", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096137.661778-12682-136659952439160/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096137.99529: _low_level_execute_command(): starting 11701 1727096137.99534: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096137.661778-12682-136659952439160/ > /dev/null 2>&1 && sleep 0' 11701 1727096138.00482: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096138.00486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096138.00488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096138.00491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096138.00510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096138.00773: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096138.00776: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096138.00779: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096138.02525: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096138.02595: stderr chunk (state=3): >>><<< 11701 1727096138.02598: stdout chunk (state=3): >>><<< 11701 1727096138.02621: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096138.02630: handler run complete 11701 1727096138.02652: Evaluated conditional (False): False 11701 1727096138.02818: variable 'result' from source: unknown 11701 1727096138.02832: Evaluated conditional ('110' in result.stdout): True 11701 1727096138.02841: attempt loop complete, returning result 11701 1727096138.02844: _execute() done 11701 1727096138.02846: dumping result to json 11701 1727096138.02851: done dumping result, returning 11701 1727096138.02861: done running TaskExecutor() for managed_node3/TASK: ** TEST check polling interval [0afff68d-5257-a05c-c957-000000000071] 11701 1727096138.02866: sending task result for task 0afff68d-5257-a05c-c957-000000000071 11701 1727096138.02961: done sending task result for task 0afff68d-5257-a05c-c957-000000000071 11701 1727096138.02965: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "grep", "Polling Interval", "/proc/net/bonding/nm-bond" ], "delta": "0:00:00.003621", "end": "2024-09-23 08:55:37.973558", "rc": 0, "start": "2024-09-23 08:55:37.969937" } STDOUT: MII Polling Interval (ms): 110 11701 1727096138.03038: no more pending results, returning what we have 11701 1727096138.03042: results queue empty 11701 1727096138.03043: checking for any_errors_fatal 11701 1727096138.03049: done checking for any_errors_fatal 11701 1727096138.03049: checking for max_fail_percentage 11701 1727096138.03053: done checking for max_fail_percentage 11701 1727096138.03054: checking to see if all hosts have failed and the running result is not ok 11701 1727096138.03055: done checking to see if all hosts have failed 11701 1727096138.03056: getting the remaining hosts for this loop 11701 1727096138.03057: done getting the remaining hosts for this loop 11701 1727096138.03060: getting the next task for host managed_node3 11701 1727096138.03074: done getting next task for host managed_node3 11701 1727096138.03080: ^ task is: TASK: ** TEST check IPv4 11701 1727096138.03082: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096138.03087: getting variables 11701 1727096138.03088: in VariableManager get_vars() 11701 1727096138.03130: Calling all_inventory to load vars for managed_node3 11701 1727096138.03132: Calling groups_inventory to load vars for managed_node3 11701 1727096138.03135: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096138.03145: Calling all_plugins_play to load vars for managed_node3 11701 1727096138.03147: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096138.03150: Calling groups_plugins_play to load vars for managed_node3 11701 1727096138.04516: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096138.05682: done with get_vars() 11701 1727096138.05716: done getting variables 11701 1727096138.05785: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:80 Monday 23 September 2024 08:55:38 -0400 (0:00:00.495) 0:00:22.022 ****** 11701 1727096138.05815: entering _queue_task() for managed_node3/command 11701 1727096138.06240: worker is 1 (out of 1 available) 11701 1727096138.06256: exiting _queue_task() for managed_node3/command 11701 1727096138.06271: done queuing things up, now waiting for results queue to drain 11701 1727096138.06272: waiting for pending results... 11701 1727096138.06685: running TaskExecutor() for managed_node3/TASK: ** TEST check IPv4 11701 1727096138.06892: in run() - task 0afff68d-5257-a05c-c957-000000000072 11701 1727096138.06896: variable 'ansible_search_path' from source: unknown 11701 1727096138.06901: calling self._execute() 11701 1727096138.06904: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096138.06907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096138.06910: variable 'omit' from source: magic vars 11701 1727096138.07285: variable 'ansible_distribution_major_version' from source: facts 11701 1727096138.07295: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096138.07302: variable 'omit' from source: magic vars 11701 1727096138.07336: variable 'omit' from source: magic vars 11701 1727096138.07449: variable 'controller_device' from source: play vars 11701 1727096138.07469: variable 'omit' from source: magic vars 11701 1727096138.07564: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096138.07571: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096138.07591: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096138.07608: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096138.07613: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096138.07650: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 
1727096138.07656: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096138.07658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096138.07815: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096138.07818: Set connection var ansible_timeout to 10 11701 1727096138.07821: Set connection var ansible_shell_type to sh 11701 1727096138.07825: Set connection var ansible_shell_executable to /bin/sh 11701 1727096138.07830: Set connection var ansible_connection to ssh 11701 1727096138.07834: Set connection var ansible_pipelining to False 11701 1727096138.07902: variable 'ansible_shell_executable' from source: unknown 11701 1727096138.08086: variable 'ansible_connection' from source: unknown 11701 1727096138.08094: variable 'ansible_module_compression' from source: unknown 11701 1727096138.08096: variable 'ansible_shell_type' from source: unknown 11701 1727096138.08099: variable 'ansible_shell_executable' from source: unknown 11701 1727096138.08101: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096138.08103: variable 'ansible_pipelining' from source: unknown 11701 1727096138.08106: variable 'ansible_timeout' from source: unknown 11701 1727096138.08108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096138.08487: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096138.08492: variable 'omit' from source: magic vars 11701 1727096138.08494: starting attempt loop 11701 1727096138.08497: running the handler 11701 1727096138.08499: _low_level_execute_command(): starting 11701 1727096138.08501: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096138.09267: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096138.09298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096138.09354: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096138.09376: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096138.09428: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096138.11465: stdout chunk (state=3): >>>/root <<< 11701 1727096138.11474: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 11701 1727096138.11485: stderr chunk (state=3): >>><<< 11701 1727096138.11490: stdout chunk (state=3): >>><<< 11701 1727096138.11561: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096138.11812: _low_level_execute_command(): starting 11701 1727096138.11816: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096138.1156054-12707-253487353855882 `" && echo ansible-tmp-1727096138.1156054-12707-253487353855882="` echo /root/.ansible/tmp/ansible-tmp-1727096138.1156054-12707-253487353855882 `" ) && sleep 0' 11701 1727096138.12933: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096138.12950: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096138.12966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096138.13011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096138.13042: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 11701 1727096138.13136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096138.13176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096138.13194: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096138.13210: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096138.13291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096138.15601: stdout chunk (state=3): 
>>>ansible-tmp-1727096138.1156054-12707-253487353855882=/root/.ansible/tmp/ansible-tmp-1727096138.1156054-12707-253487353855882 <<< 11701 1727096138.15688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096138.15691: stdout chunk (state=3): >>><<< 11701 1727096138.15693: stderr chunk (state=3): >>><<< 11701 1727096138.15696: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096138.1156054-12707-253487353855882=/root/.ansible/tmp/ansible-tmp-1727096138.1156054-12707-253487353855882 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096138.15794: variable 'ansible_module_compression' from source: unknown 11701 1727096138.15798: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11701ub3f79xu/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11701 1727096138.15942: variable 'ansible_facts' from source: unknown 11701 1727096138.16187: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096138.1156054-12707-253487353855882/AnsiballZ_command.py 11701 1727096138.16793: Sending initial data 11701 1727096138.16796: Sent initial data (156 bytes) 11701 1727096138.17880: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096138.17989: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096138.18119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096138.18378: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 11701 1727096138.18408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096138.20113: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096138.20146: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11701 1727096138.20184: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpbq9e14b2 /root/.ansible/tmp/ansible-tmp-1727096138.1156054-12707-253487353855882/AnsiballZ_command.py <<< 11701 1727096138.20188: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096138.1156054-12707-253487353855882/AnsiballZ_command.py" <<< 11701 1727096138.20420: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpbq9e14b2" to remote "/root/.ansible/tmp/ansible-tmp-1727096138.1156054-12707-253487353855882/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096138.1156054-12707-253487353855882/AnsiballZ_command.py" <<< 11701 1727096138.22121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096138.22207: stderr chunk (state=3): >>><<< 11701 1727096138.22217: stdout chunk (state=3): >>><<< 11701 1727096138.22292: done transferring module to remote 11701 1727096138.22318: _low_level_execute_command(): starting 11701 1727096138.22334: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096138.1156054-12707-253487353855882/ /root/.ansible/tmp/ansible-tmp-1727096138.1156054-12707-253487353855882/AnsiballZ_command.py && sleep 0' 11701 1727096138.23102: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096138.23143: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 
11701 1727096138.23171: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096138.23189: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096138.23350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096138.25322: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096138.25326: stdout chunk (state=3): >>><<< 11701 1727096138.25329: stderr chunk (state=3): >>><<< 11701 1727096138.25337: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096138.25529: _low_level_execute_command(): starting 11701 1727096138.25536: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096138.1156054-12707-253487353855882/AnsiballZ_command.py && sleep 0' 11701 1727096138.26572: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096138.26801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096138.26805: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096138.26842: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096138.26995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096138.43459: stdout chunk (state=3): >>> {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue 
state UP group default qlen 1000\n inet 192.0.2.26/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 235sec preferred_lft 235sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-23 08:55:38.427879", "end": "2024-09-23 08:55:38.431607", "delta": "0:00:00.003728", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11701 1727096138.45294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 11701 1727096138.45389: stderr chunk (state=3): >>><<< 11701 1727096138.45393: stdout chunk (state=3): >>><<< 11701 1727096138.45412: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.26/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 235sec preferred_lft 235sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-23 08:55:38.427879", "end": "2024-09-23 08:55:38.431607", "delta": "0:00:00.003728", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
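This is the "** TEST check IPv4" task from tests_bond.yml:80 completing: it ran ip -4 a s nm-bond, and the 192.0.2.26/24 address on nm-bond is what the '192.0.2' conditional evaluated below matches. As with the polling-interval check, the playbook source is not in the log; reconstructed from the logged command and conditional, the task plausibly follows the same pattern, with the until/changed_when details inferred rather than verbatim.

    - name: "** TEST check IPv4"
      command: ip -4 a s {{ controller_device }}
      register: result
      changed_when: false                    # inferred from the reported changed: false
      until: "'192.0.2' in result.stdout"    # inferred: the log evaluates this conditional before completing the attempt loop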
11701 1727096138.45464: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096138.1156054-12707-253487353855882/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096138.45581: _low_level_execute_command(): starting 11701 1727096138.45584: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096138.1156054-12707-253487353855882/ > /dev/null 2>&1 && sleep 0' 11701 1727096138.46383: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096138.46402: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096138.46429: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096138.46448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096138.46514: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096138.48389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096138.48472: stderr chunk (state=3): >>><<< 11701 1727096138.48476: stdout chunk (state=3): >>><<< 11701 1727096138.48526: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096138.48530: handler run complete 11701 1727096138.48532: Evaluated conditional (False): False 11701 1727096138.48758: variable 'result' from source: set_fact 11701 1727096138.48762: Evaluated conditional ('192.0.2' in result.stdout): True 11701 1727096138.48764: attempt loop complete, returning result 11701 1727096138.48766: _execute() done 11701 1727096138.48770: dumping result to json 11701 1727096138.48772: done dumping result, returning 11701 1727096138.48774: done running TaskExecutor() for managed_node3/TASK: ** TEST check IPv4 [0afff68d-5257-a05c-c957-000000000072] 11701 1727096138.48776: sending task result for task 0afff68d-5257-a05c-c957-000000000072 11701 1727096138.48878: done sending task result for task 0afff68d-5257-a05c-c957-000000000072 11701 1727096138.48882: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "nm-bond" ], "delta": "0:00:00.003728", "end": "2024-09-23 08:55:38.431607", "rc": 0, "start": "2024-09-23 08:55:38.427879" } STDOUT: 18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.26/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond valid_lft 235sec preferred_lft 235sec 11701 1727096138.49026: no more pending results, returning what we have 11701 1727096138.49032: results queue empty 11701 1727096138.49033: checking for any_errors_fatal 11701 1727096138.49042: done checking for any_errors_fatal 11701 1727096138.49043: checking for max_fail_percentage 11701 1727096138.49045: done checking for max_fail_percentage 11701 1727096138.49046: checking to see if all hosts have failed and the running result is not ok 11701 1727096138.49047: done checking to see if all hosts have failed 11701 1727096138.49047: getting the remaining hosts for this loop 11701 1727096138.49049: done getting the remaining hosts for this loop 11701 1727096138.49052: getting the next task for host managed_node3 11701 1727096138.49059: done getting next task for host managed_node3 11701 1727096138.49062: ^ task is: TASK: ** TEST check IPv6 11701 1727096138.49064: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096138.49070: getting variables 11701 1727096138.49072: in VariableManager get_vars() 11701 1727096138.49115: Calling all_inventory to load vars for managed_node3 11701 1727096138.49117: Calling groups_inventory to load vars for managed_node3 11701 1727096138.49122: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096138.49592: Calling all_plugins_play to load vars for managed_node3 11701 1727096138.49595: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096138.49599: Calling groups_plugins_play to load vars for managed_node3 11701 1727096138.52034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096138.53690: done with get_vars() 11701 1727096138.53725: done getting variables 11701 1727096138.53799: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:87 Monday 23 September 2024 08:55:38 -0400 (0:00:00.480) 0:00:22.502 ****** 11701 1727096138.53831: entering _queue_task() for managed_node3/command 11701 1727096138.54301: worker is 1 (out of 1 available) 11701 1727096138.54321: exiting _queue_task() for managed_node3/command 11701 1727096138.54332: done queuing things up, now waiting for results queue to drain 11701 1727096138.54334: waiting for pending results... 11701 1727096138.54951: running TaskExecutor() for managed_node3/TASK: ** TEST check IPv6 11701 1727096138.55337: in run() - task 0afff68d-5257-a05c-c957-000000000073 11701 1727096138.55341: variable 'ansible_search_path' from source: unknown 11701 1727096138.55344: calling self._execute() 11701 1727096138.55555: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096138.55560: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096138.55563: variable 'omit' from source: magic vars 11701 1727096138.56153: variable 'ansible_distribution_major_version' from source: facts 11701 1727096138.56171: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096138.56177: variable 'omit' from source: magic vars 11701 1727096138.56202: variable 'omit' from source: magic vars 11701 1727096138.56310: variable 'controller_device' from source: play vars 11701 1727096138.56337: variable 'omit' from source: magic vars 11701 1727096138.56384: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096138.56421: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096138.56449: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096138.56472: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096138.56483: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096138.56514: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 
1727096138.56517: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096138.56520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096138.56634: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096138.56650: Set connection var ansible_timeout to 10 11701 1727096138.56654: Set connection var ansible_shell_type to sh 11701 1727096138.56656: Set connection var ansible_shell_executable to /bin/sh 11701 1727096138.56659: Set connection var ansible_connection to ssh 11701 1727096138.56671: Set connection var ansible_pipelining to False 11701 1727096138.56769: variable 'ansible_shell_executable' from source: unknown 11701 1727096138.56773: variable 'ansible_connection' from source: unknown 11701 1727096138.56776: variable 'ansible_module_compression' from source: unknown 11701 1727096138.56779: variable 'ansible_shell_type' from source: unknown 11701 1727096138.56781: variable 'ansible_shell_executable' from source: unknown 11701 1727096138.56787: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096138.56789: variable 'ansible_pipelining' from source: unknown 11701 1727096138.56795: variable 'ansible_timeout' from source: unknown 11701 1727096138.56800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096138.56979: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096138.56984: variable 'omit' from source: magic vars 11701 1727096138.57085: starting attempt loop 11701 1727096138.57089: running the handler 11701 1727096138.57301: _low_level_execute_command(): starting 11701 1727096138.57304: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096138.58833: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096138.58884: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096138.59016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096138.59083: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096138.59186: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096138.59214: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096138.59232: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096138.59296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 11701 1727096138.60989: stdout chunk (state=3): >>>/root <<< 11701 1727096138.61086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096138.61187: stderr chunk (state=3): >>><<< 11701 1727096138.61190: stdout chunk (state=3): >>><<< 11701 1727096138.61319: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096138.61323: _low_level_execute_command(): starting 11701 1727096138.61325: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096138.6122105-12736-63039043674962 `" && echo ansible-tmp-1727096138.6122105-12736-63039043674962="` echo /root/.ansible/tmp/ansible-tmp-1727096138.6122105-12736-63039043674962 `" ) && sleep 0' 11701 1727096138.62118: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096138.62140: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096138.62155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096138.62173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096138.62190: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096138.62287: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096138.62393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096138.62421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096138.62506: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 11701 1727096138.64593: stdout chunk (state=3): >>>ansible-tmp-1727096138.6122105-12736-63039043674962=/root/.ansible/tmp/ansible-tmp-1727096138.6122105-12736-63039043674962 <<< 11701 1727096138.64616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096138.64675: stderr chunk (state=3): >>><<< 11701 1727096138.64688: stdout chunk (state=3): >>><<< 11701 1727096138.64718: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096138.6122105-12736-63039043674962=/root/.ansible/tmp/ansible-tmp-1727096138.6122105-12736-63039043674962 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096138.64756: variable 'ansible_module_compression' from source: unknown 11701 1727096138.64872: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11701ub3f79xu/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11701 1727096138.64876: variable 'ansible_facts' from source: unknown 11701 1727096138.64937: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096138.6122105-12736-63039043674962/AnsiballZ_command.py 11701 1727096138.65185: Sending initial data 11701 1727096138.65188: Sent initial data (155 bytes) 11701 1727096138.65681: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096138.65788: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096138.65802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
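The temp-directory creation and the AnsiballZ_command.py copy seen here are the non-pipelined execution path: earlier in this task the connection variable ansible_pipelining was set to False, so the module payload is written to a remote temporary directory over SFTP and executed from there (the chmod, run and cleanup commands follow below). As a hedged aside that is not part of this test run, pipelining can normally be switched on per host or group so the payload is piped straight to the remote Python instead; the variable name below is real, the file location is only illustrative.

# group_vars/all.yml (illustrative location, not part of this run)
# Enable SSH pipelining so AnsiballZ modules are piped to the remote
# interpreter instead of being copied into a temp directory as logged here.
ansible_pipelining: true

With pipelining enabled the mkdir/sftp/chmod/rm round trips in this section are generally skipped, at the usual cost of requiring that requiretty is not enforced in the remote sudo configuration.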
11701 1727096138.65824: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096138.65901: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096138.67555: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096138.67647: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11701 1727096138.67672: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpxyfw8fzi /root/.ansible/tmp/ansible-tmp-1727096138.6122105-12736-63039043674962/AnsiballZ_command.py <<< 11701 1727096138.67675: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096138.6122105-12736-63039043674962/AnsiballZ_command.py" <<< 11701 1727096138.67728: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpxyfw8fzi" to remote "/root/.ansible/tmp/ansible-tmp-1727096138.6122105-12736-63039043674962/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096138.6122105-12736-63039043674962/AnsiballZ_command.py" <<< 11701 1727096138.68796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096138.68983: stderr chunk (state=3): >>><<< 11701 1727096138.68988: stdout chunk (state=3): >>><<< 11701 1727096138.68991: done transferring module to remote 11701 1727096138.68993: _low_level_execute_command(): starting 11701 1727096138.68995: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096138.6122105-12736-63039043674962/ /root/.ansible/tmp/ansible-tmp-1727096138.6122105-12736-63039043674962/AnsiballZ_command.py && sleep 0' 11701 1727096138.70245: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096138.70303: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 11701 1727096138.70311: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096138.70415: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096138.70426: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096138.70549: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096138.72479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096138.72515: stderr chunk (state=3): >>><<< 11701 1727096138.72519: stdout chunk (state=3): >>><<< 11701 1727096138.72559: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096138.72563: _low_level_execute_command(): starting 11701 1727096138.72571: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096138.6122105-12736-63039043674962/AnsiballZ_command.py && sleep 0' 11701 1727096138.73401: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096138.73407: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096138.73411: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096138.73430: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096138.73542: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 11701 1727096138.89873: stdout chunk (state=3): >>> {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::f5/128 scope global dynamic noprefixroute \n valid_lft 234sec preferred_lft 234sec\n inet6 2001:db8::507d:eaff:fe00:d793/64 scope global dynamic noprefixroute \n valid_lft 1798sec preferred_lft 1798sec\n inet6 fe80::507d:eaff:fe00:d793/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-23 08:55:38.892078", "end": "2024-09-23 08:55:38.895775", "delta": "0:00:00.003697", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11701 1727096138.91706: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 11701 1727096138.91710: stdout chunk (state=3): >>><<< 11701 1727096138.91712: stderr chunk (state=3): >>><<< 11701 1727096138.91733: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::f5/128 scope global dynamic noprefixroute \n valid_lft 234sec preferred_lft 234sec\n inet6 2001:db8::507d:eaff:fe00:d793/64 scope global dynamic noprefixroute \n valid_lft 1798sec preferred_lft 1798sec\n inet6 fe80::507d:eaff:fe00:d793/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-23 08:55:38.892078", "end": "2024-09-23 08:55:38.895775", "delta": "0:00:00.003697", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
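For orientation, the two ** TEST checks whose results appear in this part of the log can be sketched as playbook tasks roughly like the following. The task names, the ip commands and the until conditions are taken directly from the logged invocations and conditional evaluations; the retries/delay values, the use of the controller_device play variable inside the command, and changed_when: false (inferred from the displayed result reporting changed: false while the raw command result says changed: true) are assumptions, so treat this as a sketch rather than the actual content of tests_bond.yml.

# Hedged reconstruction; the IPv6 task sits at tests_bond.yml:87 per the log.
- name: "** TEST check IPv4"
  ansible.builtin.command: ip -4 a s {{ controller_device }}
  register: result
  until: "'192.0.2' in result.stdout"
  retries: 20          # value assumed; not shown in this log
  delay: 3             # value assumed; not shown in this log
  changed_when: false  # inferred from the displayed "changed": false

- name: "** TEST check IPv6"
  ansible.builtin.command: ip -6 a s {{ controller_device }}
  register: result
  until: "'2001' in result.stdout"
  retries: 20          # value assumed
  delay: 3             # value assumed
  changed_when: false  # inferred

Both checks pass on the first attempt here (attempts: 1), since nm-bond already carries the expected 192.0.2.x and 2001:db8:: addresses.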
11701 1727096138.91789: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096138.6122105-12736-63039043674962/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096138.91873: _low_level_execute_command(): starting 11701 1727096138.91876: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096138.6122105-12736-63039043674962/ > /dev/null 2>&1 && sleep 0' 11701 1727096138.92478: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096138.92505: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096138.92520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096138.92537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096138.92564: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096138.92588: stderr chunk (state=3): >>>debug2: match not found <<< 11701 1727096138.92698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096138.92723: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096138.92794: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096138.94688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096138.94756: stderr chunk (state=3): >>><<< 11701 1727096138.94777: stdout chunk (state=3): >>><<< 11701 1727096138.94799: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096138.94811: handler run complete 11701 1727096138.94842: Evaluated conditional (False): False 11701 1727096138.95025: variable 'result' from source: set_fact 11701 1727096138.95059: Evaluated conditional ('2001' in result.stdout): True 11701 1727096138.95080: attempt loop complete, returning result 11701 1727096138.95087: _execute() done 11701 1727096138.95174: dumping result to json 11701 1727096138.95177: done dumping result, returning 11701 1727096138.95180: done running TaskExecutor() for managed_node3/TASK: ** TEST check IPv6 [0afff68d-5257-a05c-c957-000000000073] 11701 1727096138.95182: sending task result for task 0afff68d-5257-a05c-c957-000000000073 11701 1727096138.95263: done sending task result for task 0afff68d-5257-a05c-c957-000000000073 ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "nm-bond" ], "delta": "0:00:00.003697", "end": "2024-09-23 08:55:38.895775", "rc": 0, "start": "2024-09-23 08:55:38.892078" } STDOUT: 18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::f5/128 scope global dynamic noprefixroute valid_lft 234sec preferred_lft 234sec inet6 2001:db8::507d:eaff:fe00:d793/64 scope global dynamic noprefixroute valid_lft 1798sec preferred_lft 1798sec inet6 fe80::507d:eaff:fe00:d793/64 scope link noprefixroute valid_lft forever preferred_lft forever 11701 1727096138.95358: no more pending results, returning what we have 11701 1727096138.95362: results queue empty 11701 1727096138.95363: checking for any_errors_fatal 11701 1727096138.95374: done checking for any_errors_fatal 11701 1727096138.95375: checking for max_fail_percentage 11701 1727096138.95377: done checking for max_fail_percentage 11701 1727096138.95378: checking to see if all hosts have failed and the running result is not ok 11701 1727096138.95379: done checking to see if all hosts have failed 11701 1727096138.95380: getting the remaining hosts for this loop 11701 1727096138.95381: done getting the remaining hosts for this loop 11701 1727096138.95384: getting the next task for host managed_node3 11701 1727096138.95396: done getting next task for host managed_node3 11701 1727096138.95401: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11701 1727096138.95404: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11701 1727096138.95416: WORKER PROCESS EXITING 11701 1727096138.95555: getting variables 11701 1727096138.95557: in VariableManager get_vars() 11701 1727096138.95674: Calling all_inventory to load vars for managed_node3 11701 1727096138.95677: Calling groups_inventory to load vars for managed_node3 11701 1727096138.95680: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096138.95689: Calling all_plugins_play to load vars for managed_node3 11701 1727096138.95692: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096138.95695: Calling groups_plugins_play to load vars for managed_node3 11701 1727096138.97065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096138.98682: done with get_vars() 11701 1727096138.98730: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 08:55:38 -0400 (0:00:00.450) 0:00:22.953 ****** 11701 1727096138.98895: entering _queue_task() for managed_node3/include_tasks 11701 1727096138.99394: worker is 1 (out of 1 available) 11701 1727096138.99411: exiting _queue_task() for managed_node3/include_tasks 11701 1727096138.99424: done queuing things up, now waiting for results queue to drain 11701 1727096138.99425: waiting for pending results... 11701 1727096138.99757: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11701 1727096138.99966: in run() - task 0afff68d-5257-a05c-c957-00000000007c 11701 1727096138.99994: variable 'ansible_search_path' from source: unknown 11701 1727096139.00005: variable 'ansible_search_path' from source: unknown 11701 1727096139.00047: calling self._execute() 11701 1727096139.00148: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096139.00162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096139.00180: variable 'omit' from source: magic vars 11701 1727096139.00560: variable 'ansible_distribution_major_version' from source: facts 11701 1727096139.00581: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096139.00651: _execute() done 11701 1727096139.00654: dumping result to json 11701 1727096139.00657: done dumping result, returning 11701 1727096139.00659: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-a05c-c957-00000000007c] 11701 1727096139.00661: sending task result for task 0afff68d-5257-a05c-c957-00000000007c 11701 1727096139.00741: done sending task result for task 0afff68d-5257-a05c-c957-00000000007c 11701 1727096139.00744: WORKER PROCESS EXITING 11701 1727096139.00797: no more pending results, returning what we have 11701 1727096139.00803: in VariableManager get_vars() 11701 1727096139.00855: Calling all_inventory to load vars for managed_node3 11701 1727096139.00858: Calling groups_inventory to load vars for managed_node3 11701 1727096139.00861: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096139.00877: Calling all_plugins_play to load vars for managed_node3 11701 
1727096139.00879: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096139.00882: Calling groups_plugins_play to load vars for managed_node3 11701 1727096139.03416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096139.06723: done with get_vars() 11701 1727096139.06755: variable 'ansible_search_path' from source: unknown 11701 1727096139.06757: variable 'ansible_search_path' from source: unknown 11701 1727096139.06802: we have included files to process 11701 1727096139.06804: generating all_blocks data 11701 1727096139.06806: done generating all_blocks data 11701 1727096139.06812: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11701 1727096139.06813: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11701 1727096139.06816: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11701 1727096139.08295: done processing included file 11701 1727096139.08298: iterating over new_blocks loaded from include file 11701 1727096139.08299: in VariableManager get_vars() 11701 1727096139.08332: done with get_vars() 11701 1727096139.08334: filtering new block on tags 11701 1727096139.08374: done filtering new block on tags 11701 1727096139.08491: in VariableManager get_vars() 11701 1727096139.08519: done with get_vars() 11701 1727096139.08521: filtering new block on tags 11701 1727096139.08632: done filtering new block on tags 11701 1727096139.08636: in VariableManager get_vars() 11701 1727096139.08664: done with get_vars() 11701 1727096139.08667: filtering new block on tags 11701 1727096139.08850: done filtering new block on tags 11701 1727096139.08855: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 11701 1727096139.08860: extending task lists for all hosts with included blocks 11701 1727096139.11243: done extending task lists 11701 1727096139.11245: done processing included files 11701 1727096139.11246: results queue empty 11701 1727096139.11247: checking for any_errors_fatal 11701 1727096139.11254: done checking for any_errors_fatal 11701 1727096139.11255: checking for max_fail_percentage 11701 1727096139.11256: done checking for max_fail_percentage 11701 1727096139.11257: checking to see if all hosts have failed and the running result is not ok 11701 1727096139.11258: done checking to see if all hosts have failed 11701 1727096139.11259: getting the remaining hosts for this loop 11701 1727096139.11260: done getting the remaining hosts for this loop 11701 1727096139.11263: getting the next task for host managed_node3 11701 1727096139.11269: done getting next task for host managed_node3 11701 1727096139.11273: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11701 1727096139.11277: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11701 1727096139.11375: getting variables 11701 1727096139.11377: in VariableManager get_vars() 11701 1727096139.11402: Calling all_inventory to load vars for managed_node3 11701 1727096139.11404: Calling groups_inventory to load vars for managed_node3 11701 1727096139.11406: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096139.11412: Calling all_plugins_play to load vars for managed_node3 11701 1727096139.11415: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096139.11417: Calling groups_plugins_play to load vars for managed_node3 11701 1727096139.13923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096139.17928: done with get_vars() 11701 1727096139.17960: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Monday 23 September 2024 08:55:39 -0400 (0:00:00.192) 0:00:23.146 ****** 11701 1727096139.18172: entering _queue_task() for managed_node3/setup 11701 1727096139.18994: worker is 1 (out of 1 available) 11701 1727096139.19005: exiting _queue_task() for managed_node3/setup 11701 1727096139.19012: done queuing things up, now waiting for results queue to drain 11701 1727096139.19013: waiting for pending results... 
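The block above is the role entry point pulling in its fact-gathering tasks: the include at roles/network/tasks/main.yml:4 loads set_facts.yml for managed_node3 and extends the task list with the blocks just processed, and the first of those included tasks is what is being queued here. A minimal sketch of that include as implied by this log (whether the distribution check sits on the include itself or on an enclosing block is not visible here):

# Hedged sketch of roles/network/tasks/main.yml:4 as implied by the log
- name: Ensure ansible_facts used by role
  include_tasks: set_facts.yml
  when: ansible_distribution_major_version != '6'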
11701 1727096139.19285: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11701 1727096139.19344: in run() - task 0afff68d-5257-a05c-c957-000000000491 11701 1727096139.19364: variable 'ansible_search_path' from source: unknown 11701 1727096139.19372: variable 'ansible_search_path' from source: unknown 11701 1727096139.19415: calling self._execute() 11701 1727096139.19527: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096139.19592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096139.19597: variable 'omit' from source: magic vars 11701 1727096139.19933: variable 'ansible_distribution_major_version' from source: facts 11701 1727096139.19945: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096139.20172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11701 1727096139.24079: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11701 1727096139.24293: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11701 1727096139.24297: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11701 1727096139.24326: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11701 1727096139.24351: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11701 1727096139.24473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096139.24494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096139.24520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096139.24565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096139.24581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096139.24638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096139.24664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096139.24691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096139.24732: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096139.24747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096139.25036: variable '__network_required_facts' from source: role '' defaults 11701 1727096139.25040: variable 'ansible_facts' from source: unknown 11701 1727096139.25837: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 11701 1727096139.25842: when evaluation is False, skipping this task 11701 1727096139.25845: _execute() done 11701 1727096139.25847: dumping result to json 11701 1727096139.25849: done dumping result, returning 11701 1727096139.25860: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-a05c-c957-000000000491] 11701 1727096139.25863: sending task result for task 0afff68d-5257-a05c-c957-000000000491 11701 1727096139.25963: done sending task result for task 0afff68d-5257-a05c-c957-000000000491 11701 1727096139.26098: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11701 1727096139.26157: no more pending results, returning what we have 11701 1727096139.26162: results queue empty 11701 1727096139.26164: checking for any_errors_fatal 11701 1727096139.26165: done checking for any_errors_fatal 11701 1727096139.26166: checking for max_fail_percentage 11701 1727096139.26170: done checking for max_fail_percentage 11701 1727096139.26171: checking to see if all hosts have failed and the running result is not ok 11701 1727096139.26172: done checking to see if all hosts have failed 11701 1727096139.26173: getting the remaining hosts for this loop 11701 1727096139.26174: done getting the remaining hosts for this loop 11701 1727096139.26179: getting the next task for host managed_node3 11701 1727096139.26189: done getting next task for host managed_node3 11701 1727096139.26193: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 11701 1727096139.26199: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11701 1727096139.26218: getting variables 11701 1727096139.26220: in VariableManager get_vars() 11701 1727096139.26564: Calling all_inventory to load vars for managed_node3 11701 1727096139.26569: Calling groups_inventory to load vars for managed_node3 11701 1727096139.26573: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096139.26584: Calling all_plugins_play to load vars for managed_node3 11701 1727096139.26586: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096139.26589: Calling groups_plugins_play to load vars for managed_node3 11701 1727096139.28794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096139.30691: done with get_vars() 11701 1727096139.30722: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Monday 23 September 2024 08:55:39 -0400 (0:00:00.126) 0:00:23.272 ****** 11701 1727096139.30835: entering _queue_task() for managed_node3/stat 11701 1727096139.31618: worker is 1 (out of 1 available) 11701 1727096139.31633: exiting _queue_task() for managed_node3/stat 11701 1727096139.31644: done queuing things up, now waiting for results queue to drain 11701 1727096139.31645: waiting for pending results... 11701 1727096139.31903: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 11701 1727096139.32099: in run() - task 0afff68d-5257-a05c-c957-000000000493 11701 1727096139.32174: variable 'ansible_search_path' from source: unknown 11701 1727096139.32178: variable 'ansible_search_path' from source: unknown 11701 1727096139.32182: calling self._execute() 11701 1727096139.32270: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096139.32286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096139.32312: variable 'omit' from source: magic vars 11701 1727096139.32698: variable 'ansible_distribution_major_version' from source: facts 11701 1727096139.32717: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096139.32893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11701 1727096139.33174: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11701 1727096139.33229: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11701 1727096139.33288: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11701 1727096139.33315: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11701 1727096139.33506: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11701 1727096139.33513: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11701 1727096139.33516: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, 
class_only=False) 11701 1727096139.33519: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11701 1727096139.33601: variable '__network_is_ostree' from source: set_fact 11701 1727096139.33624: Evaluated conditional (not __network_is_ostree is defined): False 11701 1727096139.33631: when evaluation is False, skipping this task 11701 1727096139.33637: _execute() done 11701 1727096139.33642: dumping result to json 11701 1727096139.33648: done dumping result, returning 11701 1727096139.33659: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-a05c-c957-000000000493] 11701 1727096139.33668: sending task result for task 0afff68d-5257-a05c-c957-000000000493 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11701 1727096139.33875: no more pending results, returning what we have 11701 1727096139.33879: results queue empty 11701 1727096139.33880: checking for any_errors_fatal 11701 1727096139.33888: done checking for any_errors_fatal 11701 1727096139.33888: checking for max_fail_percentage 11701 1727096139.33890: done checking for max_fail_percentage 11701 1727096139.33891: checking to see if all hosts have failed and the running result is not ok 11701 1727096139.33892: done checking to see if all hosts have failed 11701 1727096139.33893: getting the remaining hosts for this loop 11701 1727096139.33894: done getting the remaining hosts for this loop 11701 1727096139.33897: getting the next task for host managed_node3 11701 1727096139.33911: done getting next task for host managed_node3 11701 1727096139.33915: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11701 1727096139.33922: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11701 1727096139.33944: getting variables 11701 1727096139.33945: in VariableManager get_vars() 11701 1727096139.34374: Calling all_inventory to load vars for managed_node3 11701 1727096139.34377: Calling groups_inventory to load vars for managed_node3 11701 1727096139.34380: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096139.34389: Calling all_plugins_play to load vars for managed_node3 11701 1727096139.34392: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096139.34394: Calling groups_plugins_play to load vars for managed_node3 11701 1727096139.34954: done sending task result for task 0afff68d-5257-a05c-c957-000000000493 11701 1727096139.34959: WORKER PROCESS EXITING 11701 1727096139.36490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096139.38665: done with get_vars() 11701 1727096139.38740: done getting variables 11701 1727096139.38873: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Monday 23 September 2024 08:55:39 -0400 (0:00:00.080) 0:00:23.353 ****** 11701 1727096139.38911: entering _queue_task() for managed_node3/set_fact 11701 1727096139.39537: worker is 1 (out of 1 available) 11701 1727096139.39549: exiting _queue_task() for managed_node3/set_fact 11701 1727096139.39564: done queuing things up, now waiting for results queue to drain 11701 1727096139.39566: waiting for pending results... 
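The two skipped tasks above are the start of set_facts.yml: a conditional fact-gathering step at line 3 (skipped because every fact named in __network_required_facts is already present) and an ostree probe at line 12 (skipped because __network_is_ostree was already set earlier in the play). A rough sketch as implied by the log; the setup parameters, the stat path and the register name are assumptions, while the conditions and the no_log behaviour are quoted from the log:

# Hedged sketch of roles/network/tasks/set_facts.yml lines 3 and 12
- name: Ensure ansible_facts used by role are present
  setup:  # parameters not visible in this excerpt
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  no_log: true  # the log shows "output has been hidden ... 'no_log: true'"

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted  # assumed path; not shown in this excerpt
  register: __ostree_booted_stat  # hypothetical register name
  when: not __network_is_ostree is defined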
11701 1727096139.40070: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11701 1727096139.40473: in run() - task 0afff68d-5257-a05c-c957-000000000494 11701 1727096139.40479: variable 'ansible_search_path' from source: unknown 11701 1727096139.40483: variable 'ansible_search_path' from source: unknown 11701 1727096139.40512: calling self._execute() 11701 1727096139.40688: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096139.40692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096139.40695: variable 'omit' from source: magic vars 11701 1727096139.41055: variable 'ansible_distribution_major_version' from source: facts 11701 1727096139.41086: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096139.41251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11701 1727096139.41529: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11701 1727096139.41575: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11701 1727096139.41612: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11701 1727096139.41646: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11701 1727096139.41733: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11701 1727096139.41762: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11701 1727096139.41792: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096139.41819: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11701 1727096139.41917: variable '__network_is_ostree' from source: set_fact 11701 1727096139.41924: Evaluated conditional (not __network_is_ostree is defined): False 11701 1727096139.41928: when evaluation is False, skipping this task 11701 1727096139.41930: _execute() done 11701 1727096139.41933: dumping result to json 11701 1727096139.41935: done dumping result, returning 11701 1727096139.41945: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-a05c-c957-000000000494] 11701 1727096139.41948: sending task result for task 0afff68d-5257-a05c-c957-000000000494 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11701 1727096139.42099: no more pending results, returning what we have 11701 1727096139.42103: results queue empty 11701 1727096139.42105: checking for any_errors_fatal 11701 1727096139.42113: done checking for any_errors_fatal 11701 1727096139.42114: checking for max_fail_percentage 11701 1727096139.42116: done checking for max_fail_percentage 11701 1727096139.42117: checking to see 
if all hosts have failed and the running result is not ok 11701 1727096139.42118: done checking to see if all hosts have failed 11701 1727096139.42119: getting the remaining hosts for this loop 11701 1727096139.42120: done getting the remaining hosts for this loop 11701 1727096139.42124: getting the next task for host managed_node3 11701 1727096139.42134: done getting next task for host managed_node3 11701 1727096139.42139: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 11701 1727096139.42144: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11701 1727096139.42166: getting variables 11701 1727096139.42170: in VariableManager get_vars() 11701 1727096139.42216: Calling all_inventory to load vars for managed_node3 11701 1727096139.42219: Calling groups_inventory to load vars for managed_node3 11701 1727096139.42222: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096139.42235: Calling all_plugins_play to load vars for managed_node3 11701 1727096139.42238: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096139.42242: Calling groups_plugins_play to load vars for managed_node3 11701 1727096139.42875: done sending task result for task 0afff68d-5257-a05c-c957-000000000494 11701 1727096139.43391: WORKER PROCESS EXITING 11701 1727096139.44229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096139.45119: done with get_vars() 11701 1727096139.45141: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Monday 23 September 2024 08:55:39 -0400 (0:00:00.063) 0:00:23.416 ****** 11701 1727096139.45221: entering _queue_task() for managed_node3/service_facts 11701 1727096139.45494: worker is 1 (out of 1 available) 11701 1727096139.45508: exiting _queue_task() for managed_node3/service_facts 11701 1727096139.45520: done queuing things up, now waiting for results queue to drain 11701 1727096139.45522: waiting for pending results... 
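Rounding out set_facts.yml, the flag-setting task at line 17 is skipped for the same reason as the ostree probe (__network_is_ostree already exists as a set_fact), and only the service_facts task at line 21 actually runs, opening the SSH session that follows. A rough sketch as implied by the log; the expression assigned to __network_is_ostree builds on the hypothetical register name from the previous sketch and is an assumption:

# Hedged sketch of roles/network/tasks/set_facts.yml lines 17 and 21
- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists | d(false) }}"  # assumed expression
  when: not __network_is_ostree is defined

- name: Check which services are running
  service_facts: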
11701 1727096139.45714: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 11701 1727096139.45829: in run() - task 0afff68d-5257-a05c-c957-000000000496 11701 1727096139.45842: variable 'ansible_search_path' from source: unknown 11701 1727096139.45846: variable 'ansible_search_path' from source: unknown 11701 1727096139.45914: calling self._execute() 11701 1727096139.46022: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096139.46026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096139.46028: variable 'omit' from source: magic vars 11701 1727096139.46484: variable 'ansible_distribution_major_version' from source: facts 11701 1727096139.46488: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096139.46491: variable 'omit' from source: magic vars 11701 1727096139.46493: variable 'omit' from source: magic vars 11701 1727096139.46506: variable 'omit' from source: magic vars 11701 1727096139.46548: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096139.46582: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096139.46603: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096139.46625: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096139.46640: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096139.46673: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096139.46676: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096139.46679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096139.46789: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096139.46795: Set connection var ansible_timeout to 10 11701 1727096139.46798: Set connection var ansible_shell_type to sh 11701 1727096139.46810: Set connection var ansible_shell_executable to /bin/sh 11701 1727096139.46813: Set connection var ansible_connection to ssh 11701 1727096139.46815: Set connection var ansible_pipelining to False 11701 1727096139.46873: variable 'ansible_shell_executable' from source: unknown 11701 1727096139.46876: variable 'ansible_connection' from source: unknown 11701 1727096139.46879: variable 'ansible_module_compression' from source: unknown 11701 1727096139.46881: variable 'ansible_shell_type' from source: unknown 11701 1727096139.46883: variable 'ansible_shell_executable' from source: unknown 11701 1727096139.46884: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096139.46886: variable 'ansible_pipelining' from source: unknown 11701 1727096139.46888: variable 'ansible_timeout' from source: unknown 11701 1727096139.46890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096139.47046: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11701 1727096139.47056: variable 'omit' from source: magic vars 11701 
1727096139.47059: starting attempt loop 11701 1727096139.47062: running the handler 11701 1727096139.47086: _low_level_execute_command(): starting 11701 1727096139.47089: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096139.48380: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096139.48501: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096139.48504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096139.48594: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096139.50293: stdout chunk (state=3): >>>/root <<< 11701 1727096139.50391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096139.50434: stderr chunk (state=3): >>><<< 11701 1727096139.50441: stdout chunk (state=3): >>><<< 11701 1727096139.50457: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096139.50475: _low_level_execute_command(): starting 11701 1727096139.50482: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096139.5046017-12785-74924782844688 `" && echo ansible-tmp-1727096139.5046017-12785-74924782844688="` echo /root/.ansible/tmp/ansible-tmp-1727096139.5046017-12785-74924782844688 `" ) && sleep 
0' 11701 1727096139.50940: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096139.50944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 11701 1727096139.50983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096139.50988: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096139.51008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096139.51102: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096139.51122: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096139.51463: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096139.53297: stdout chunk (state=3): >>>ansible-tmp-1727096139.5046017-12785-74924782844688=/root/.ansible/tmp/ansible-tmp-1727096139.5046017-12785-74924782844688 <<< 11701 1727096139.53417: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096139.53447: stderr chunk (state=3): >>><<< 11701 1727096139.53461: stdout chunk (state=3): >>><<< 11701 1727096139.53492: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096139.5046017-12785-74924782844688=/root/.ansible/tmp/ansible-tmp-1727096139.5046017-12785-74924782844688 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096139.53562: variable 'ansible_module_compression' from source: unknown 11701 1727096139.53616: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-11701ub3f79xu/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 11701 1727096139.53675: variable 'ansible_facts' from source: unknown 11701 1727096139.53811: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096139.5046017-12785-74924782844688/AnsiballZ_service_facts.py 11701 1727096139.54009: Sending initial data 11701 1727096139.54012: Sent initial data (161 bytes) 11701 1727096139.54486: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096139.54490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 11701 1727096139.54492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096139.54496: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096139.54498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096139.54550: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096139.54553: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096139.54556: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096139.54605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096139.56263: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096139.56298: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11701 1727096139.56338: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpypfjg_77 /root/.ansible/tmp/ansible-tmp-1727096139.5046017-12785-74924782844688/AnsiballZ_service_facts.py <<< 11701 1727096139.56352: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096139.5046017-12785-74924782844688/AnsiballZ_service_facts.py" <<< 11701 1727096139.56391: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 11701 1727096139.56395: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpypfjg_77" to remote "/root/.ansible/tmp/ansible-tmp-1727096139.5046017-12785-74924782844688/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096139.5046017-12785-74924782844688/AnsiballZ_service_facts.py" <<< 11701 1727096139.57058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096139.57110: stderr chunk (state=3): >>><<< 11701 1727096139.57114: stdout chunk (state=3): >>><<< 11701 1727096139.57247: done transferring module to remote 11701 1727096139.57254: _low_level_execute_command(): starting 11701 1727096139.57257: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096139.5046017-12785-74924782844688/ /root/.ansible/tmp/ansible-tmp-1727096139.5046017-12785-74924782844688/AnsiballZ_service_facts.py && sleep 0' 11701 1727096139.58073: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096139.58122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096139.58166: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11701 1727096139.58277: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096139.58311: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096139.58423: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096139.60281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096139.60286: stdout chunk (state=3): >>><<< 11701 1727096139.60294: stderr chunk (state=3): >>><<< 11701 1727096139.60309: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096139.60312: _low_level_execute_command(): starting 11701 1727096139.60316: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096139.5046017-12785-74924782844688/AnsiballZ_service_facts.py && sleep 0' 11701 1727096139.60818: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096139.60822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096139.60837: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096139.60913: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096139.60945: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096139.60984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096141.25611: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": 
"cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 11701 1727096141.25624: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", 
"state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integratio<<< 11701 1727096141.25631: stdout chunk (state=3): >>>n.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": 
"qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": 
"systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 11701 1727096141.27340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 11701 1727096141.27344: stdout chunk (state=3): >>><<< 11701 1727096141.27347: stderr chunk (state=3): >>><<< 11701 1727096141.27379: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": 
{"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": 
{"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": 
"systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", 
"status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": 
"grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": 
"system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
11701 1727096141.28650: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096139.5046017-12785-74924782844688/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096141.28657: _low_level_execute_command(): starting 11701 1727096141.28660: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096139.5046017-12785-74924782844688/ > /dev/null 2>&1 && sleep 0' 11701 1727096141.29323: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096141.29490: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096141.29501: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096141.31359: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096141.31586: stderr chunk (state=3): >>><<< 11701 1727096141.31590: stdout chunk (state=3): >>><<< 11701 1727096141.31611: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096141.31616: handler run complete 11701 1727096141.32121: variable 'ansible_facts' from source: unknown 11701 1727096141.32124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096141.32565: variable 'ansible_facts' from source: unknown 11701 1727096141.32571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096141.32875: attempt loop complete, returning result 11701 1727096141.32879: _execute() done 11701 1727096141.32881: dumping result to json 11701 1727096141.32883: done dumping result, returning 11701 1727096141.32885: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-a05c-c957-000000000496] 11701 1727096141.32887: sending task result for task 0afff68d-5257-a05c-c957-000000000496 11701 1727096141.34103: done sending task result for task 0afff68d-5257-a05c-c957-000000000496 11701 1727096141.34107: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11701 1727096141.34187: no more pending results, returning what we have 11701 1727096141.34190: results queue empty 11701 1727096141.34191: checking for any_errors_fatal 11701 1727096141.34195: done checking for any_errors_fatal 11701 1727096141.34195: checking for max_fail_percentage 11701 1727096141.34197: done checking for max_fail_percentage 11701 1727096141.34198: checking to see if all hosts have failed and the running result is not ok 11701 1727096141.34198: done checking to see if all hosts have failed 11701 1727096141.34199: getting the remaining hosts for this loop 11701 1727096141.34200: done getting the remaining hosts for this loop 11701 1727096141.34204: getting the next task for host managed_node3 11701 1727096141.34210: done getting next task for host managed_node3 11701 1727096141.34215: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 11701 1727096141.34221: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11701 1727096141.34232: getting variables 11701 1727096141.34233: in VariableManager get_vars() 11701 1727096141.34278: Calling all_inventory to load vars for managed_node3 11701 1727096141.34281: Calling groups_inventory to load vars for managed_node3 11701 1727096141.34284: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096141.34293: Calling all_plugins_play to load vars for managed_node3 11701 1727096141.34296: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096141.34299: Calling groups_plugins_play to load vars for managed_node3 11701 1727096141.35835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096141.37664: done with get_vars() 11701 1727096141.37700: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Monday 23 September 2024 08:55:41 -0400 (0:00:01.925) 0:00:25.342 ****** 11701 1727096141.37825: entering _queue_task() for managed_node3/package_facts 11701 1727096141.38412: worker is 1 (out of 1 available) 11701 1727096141.38426: exiting _queue_task() for managed_node3/package_facts 11701 1727096141.38439: done queuing things up, now waiting for results queue to drain 11701 1727096141.38441: waiting for pending results... 11701 1727096141.38652: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 11701 1727096141.38975: in run() - task 0afff68d-5257-a05c-c957-000000000497 11701 1727096141.38979: variable 'ansible_search_path' from source: unknown 11701 1727096141.38982: variable 'ansible_search_path' from source: unknown 11701 1727096141.38984: calling self._execute() 11701 1727096141.39005: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096141.39011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096141.39021: variable 'omit' from source: magic vars 11701 1727096141.39449: variable 'ansible_distribution_major_version' from source: facts 11701 1727096141.39465: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096141.39473: variable 'omit' from source: magic vars 11701 1727096141.39570: variable 'omit' from source: magic vars 11701 1727096141.39612: variable 'omit' from source: magic vars 11701 1727096141.39665: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096141.39704: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096141.39732: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096141.39750: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096141.39776: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096141.39832: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096141.39836: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096141.39838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096141.40074: Set connection var ansible_module_compression to ZIP_DEFLATED 
11701 1727096141.40081: Set connection var ansible_timeout to 10 11701 1727096141.40084: Set connection var ansible_shell_type to sh 11701 1727096141.40087: Set connection var ansible_shell_executable to /bin/sh 11701 1727096141.40089: Set connection var ansible_connection to ssh 11701 1727096141.40091: Set connection var ansible_pipelining to False 11701 1727096141.40097: variable 'ansible_shell_executable' from source: unknown 11701 1727096141.40101: variable 'ansible_connection' from source: unknown 11701 1727096141.40103: variable 'ansible_module_compression' from source: unknown 11701 1727096141.40106: variable 'ansible_shell_type' from source: unknown 11701 1727096141.40107: variable 'ansible_shell_executable' from source: unknown 11701 1727096141.40109: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096141.40111: variable 'ansible_pipelining' from source: unknown 11701 1727096141.40113: variable 'ansible_timeout' from source: unknown 11701 1727096141.40115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096141.40559: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11701 1727096141.40563: variable 'omit' from source: magic vars 11701 1727096141.40787: starting attempt loop 11701 1727096141.40790: running the handler 11701 1727096141.40793: _low_level_execute_command(): starting 11701 1727096141.40795: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096141.42127: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096141.42186: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096141.42251: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096141.42292: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096141.42327: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096141.44108: stdout chunk (state=3): >>>/root <<< 11701 1727096141.44791: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096141.45009: stderr chunk (state=3): >>><<< 11701 1727096141.45013: stdout chunk (state=3): >>><<< 11701 1727096141.45017: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096141.45020: _low_level_execute_command(): starting 11701 1727096141.45022: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096141.449115-12883-169355015176558 `" && echo ansible-tmp-1727096141.449115-12883-169355015176558="` echo /root/.ansible/tmp/ansible-tmp-1727096141.449115-12883-169355015176558 `" ) && sleep 0' 11701 1727096141.46200: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096141.46537: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096141.46575: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096141.48579: stdout chunk (state=3): >>>ansible-tmp-1727096141.449115-12883-169355015176558=/root/.ansible/tmp/ansible-tmp-1727096141.449115-12883-169355015176558 <<< 11701 1727096141.48679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096141.48729: stderr chunk (state=3): >>><<< 11701 1727096141.48740: stdout chunk (state=3): >>><<< 11701 1727096141.48770: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096141.449115-12883-169355015176558=/root/.ansible/tmp/ansible-tmp-1727096141.449115-12883-169355015176558 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096141.48823: variable 'ansible_module_compression' from source: unknown 11701 1727096141.49073: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11701ub3f79xu/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 11701 1727096141.49473: variable 'ansible_facts' from source: unknown 11701 1727096141.49504: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096141.449115-12883-169355015176558/AnsiballZ_package_facts.py 11701 1727096141.49799: Sending initial data 11701 1727096141.49802: Sent initial data (161 bytes) 11701 1727096141.51276: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096141.51281: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096141.51283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096141.51321: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096141.51495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096141.53161: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports 
extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096141.53317: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11701 1727096141.53530: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpjmz_9mgs /root/.ansible/tmp/ansible-tmp-1727096141.449115-12883-169355015176558/AnsiballZ_package_facts.py <<< 11701 1727096141.53534: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096141.449115-12883-169355015176558/AnsiballZ_package_facts.py" <<< 11701 1727096141.53552: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpjmz_9mgs" to remote "/root/.ansible/tmp/ansible-tmp-1727096141.449115-12883-169355015176558/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096141.449115-12883-169355015176558/AnsiballZ_package_facts.py" <<< 11701 1727096141.56274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096141.56278: stdout chunk (state=3): >>><<< 11701 1727096141.56285: stderr chunk (state=3): >>><<< 11701 1727096141.56478: done transferring module to remote 11701 1727096141.56489: _low_level_execute_command(): starting 11701 1727096141.56493: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096141.449115-12883-169355015176558/ /root/.ansible/tmp/ansible-tmp-1727096141.449115-12883-169355015176558/AnsiballZ_package_facts.py && sleep 0' 11701 1727096141.57846: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096141.58074: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096141.58078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096141.58081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096141.58083: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096141.58085: stderr chunk (state=3): >>>debug2: match not found <<< 11701 1727096141.58087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096141.58089: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11701 1727096141.58091: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 11701 1727096141.58095: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096141.58097: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096141.58099: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096141.58147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 11701 1727096141.58219: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096141.60404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096141.60408: stdout chunk (state=3): >>><<< 11701 1727096141.60410: stderr chunk (state=3): >>><<< 11701 1727096141.60427: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096141.60431: _low_level_execute_command(): starting 11701 1727096141.60435: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096141.449115-12883-169355015176558/AnsiballZ_package_facts.py && sleep 0' 11701 1727096141.62009: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096141.62076: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096141.62090: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096141.62101: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096141.62172: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096142.07221: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", 
"release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 11701 1727096142.07237: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": 
"3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 11701 1727096142.07623: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", 
"version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", 
"release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": 
"3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": 
"510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 11701 1727096142.07638: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": 
"vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 11701 1727096142.07641: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 11701 1727096142.09322: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 11701 1727096142.09435: stdout chunk (state=3): >>><<< 11701 1727096142.09441: stderr chunk (state=3): >>><<< 11701 1727096142.09677: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 11701 1727096142.14841: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096141.449115-12883-169355015176558/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096142.14961: _low_level_execute_command(): starting 11701 1727096142.14965: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096141.449115-12883-169355015176558/ > /dev/null 2>&1 && sleep 0' 11701 1727096142.15544: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096142.15560: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096142.15578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096142.15616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 11701 1727096142.15629: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096142.15713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096142.15735: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096142.15747: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096142.15854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096142.17839: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096142.17843: stdout chunk (state=3): >>><<< 11701 1727096142.17848: stderr chunk (state=3): >>><<< 11701 1727096142.17864: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096142.17872: handler run complete 11701 1727096142.19974: variable 'ansible_facts' from source: unknown 11701 1727096142.20259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096142.23747: variable 'ansible_facts' from source: unknown 11701 1727096142.24522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096142.25197: attempt loop complete, returning result 11701 1727096142.25215: _execute() done 11701 1727096142.25224: dumping result to json 11701 1727096142.25440: done dumping result, returning 11701 1727096142.25459: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-a05c-c957-000000000497] 11701 1727096142.25473: sending task result for task 0afff68d-5257-a05c-c957-000000000497 11701 1727096142.28122: done sending task result for task 0afff68d-5257-a05c-c957-000000000497 11701 1727096142.28125: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11701 1727096142.28256: no more pending results, returning what we have 11701 1727096142.28259: results queue empty 11701 1727096142.28260: checking for any_errors_fatal 11701 1727096142.28264: done checking for any_errors_fatal 11701 1727096142.28265: checking for max_fail_percentage 11701 1727096142.28267: done checking for max_fail_percentage 11701 1727096142.28270: checking to see if all hosts have failed and the running result is not ok 11701 1727096142.28272: done checking to see if all hosts have failed 11701 1727096142.28272: getting the remaining hosts for this loop 11701 1727096142.28273: done getting the remaining hosts for this loop 11701 1727096142.28276: getting the next task for host managed_node3 11701 1727096142.28283: done getting next task for host managed_node3 11701 1727096142.28286: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 11701 1727096142.28290: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11701 1727096142.28375: getting variables 11701 1727096142.28377: in VariableManager get_vars() 11701 1727096142.28424: Calling all_inventory to load vars for managed_node3 11701 1727096142.28427: Calling groups_inventory to load vars for managed_node3 11701 1727096142.28429: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096142.28438: Calling all_plugins_play to load vars for managed_node3 11701 1727096142.28440: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096142.28443: Calling groups_plugins_play to load vars for managed_node3 11701 1727096142.31499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096142.33347: done with get_vars() 11701 1727096142.33385: done getting variables 11701 1727096142.33446: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 08:55:42 -0400 (0:00:00.957) 0:00:26.299 ****** 11701 1727096142.33541: entering _queue_task() for managed_node3/debug 11701 1727096142.34295: worker is 1 (out of 1 available) 11701 1727096142.34313: exiting _queue_task() for managed_node3/debug 11701 1727096142.34324: done queuing things up, now waiting for results queue to drain 11701 1727096142.34325: waiting for pending results... 
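The trace above covers the role's "Check which packages are installed" task (its result is censored because no_log is set), and the line above queues the "Print network provider" debug task from tasks/main.yml:7. Reconstructed from the module invocation arguments and the message rendered further down in this log, the two tasks plausibly look like the sketch below; the task names match the log, everything else is inferred for illustration rather than copied from the role source:

    - name: Check which packages are installed
      package_facts:
        manager: auto        # invocation above shows manager: ["auto"]
        strategy: first      # invocation above shows strategy: "first"
      no_log: true           # matches the censored result above

    - name: Print network provider
      debug:
        msg: "Using network provider: {{ network_provider }}"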
11701 1727096142.34866: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 11701 1727096142.35207: in run() - task 0afff68d-5257-a05c-c957-00000000007d 11701 1727096142.35230: variable 'ansible_search_path' from source: unknown 11701 1727096142.35239: variable 'ansible_search_path' from source: unknown 11701 1727096142.35287: calling self._execute() 11701 1727096142.35579: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096142.35773: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096142.35777: variable 'omit' from source: magic vars 11701 1727096142.36373: variable 'ansible_distribution_major_version' from source: facts 11701 1727096142.36394: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096142.36404: variable 'omit' from source: magic vars 11701 1727096142.36479: variable 'omit' from source: magic vars 11701 1727096142.36780: variable 'network_provider' from source: set_fact 11701 1727096142.36803: variable 'omit' from source: magic vars 11701 1727096142.36850: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096142.37173: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096142.37176: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096142.37179: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096142.37181: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096142.37205: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096142.37214: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096142.37222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096142.37333: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096142.37672: Set connection var ansible_timeout to 10 11701 1727096142.37675: Set connection var ansible_shell_type to sh 11701 1727096142.37678: Set connection var ansible_shell_executable to /bin/sh 11701 1727096142.37680: Set connection var ansible_connection to ssh 11701 1727096142.37682: Set connection var ansible_pipelining to False 11701 1727096142.37683: variable 'ansible_shell_executable' from source: unknown 11701 1727096142.37686: variable 'ansible_connection' from source: unknown 11701 1727096142.37689: variable 'ansible_module_compression' from source: unknown 11701 1727096142.37691: variable 'ansible_shell_type' from source: unknown 11701 1727096142.37694: variable 'ansible_shell_executable' from source: unknown 11701 1727096142.37696: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096142.37699: variable 'ansible_pipelining' from source: unknown 11701 1727096142.37701: variable 'ansible_timeout' from source: unknown 11701 1727096142.37704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096142.37826: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 11701 1727096142.38172: variable 'omit' from source: magic vars 11701 1727096142.38175: starting attempt loop 11701 1727096142.38178: running the handler 11701 1727096142.38181: handler run complete 11701 1727096142.38184: attempt loop complete, returning result 11701 1727096142.38187: _execute() done 11701 1727096142.38190: dumping result to json 11701 1727096142.38193: done dumping result, returning 11701 1727096142.38195: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-a05c-c957-00000000007d] 11701 1727096142.38373: sending task result for task 0afff68d-5257-a05c-c957-00000000007d 11701 1727096142.38445: done sending task result for task 0afff68d-5257-a05c-c957-00000000007d 11701 1727096142.38449: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 11701 1727096142.38534: no more pending results, returning what we have 11701 1727096142.38539: results queue empty 11701 1727096142.38540: checking for any_errors_fatal 11701 1727096142.38549: done checking for any_errors_fatal 11701 1727096142.38549: checking for max_fail_percentage 11701 1727096142.38551: done checking for max_fail_percentage 11701 1727096142.38552: checking to see if all hosts have failed and the running result is not ok 11701 1727096142.38554: done checking to see if all hosts have failed 11701 1727096142.38554: getting the remaining hosts for this loop 11701 1727096142.38556: done getting the remaining hosts for this loop 11701 1727096142.38560: getting the next task for host managed_node3 11701 1727096142.38572: done getting next task for host managed_node3 11701 1727096142.38576: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11701 1727096142.38581: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11701 1727096142.38595: getting variables 11701 1727096142.38597: in VariableManager get_vars() 11701 1727096142.38752: Calling all_inventory to load vars for managed_node3 11701 1727096142.38755: Calling groups_inventory to load vars for managed_node3 11701 1727096142.38757: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096142.38770: Calling all_plugins_play to load vars for managed_node3 11701 1727096142.38773: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096142.38775: Calling groups_plugins_play to load vars for managed_node3 11701 1727096142.49754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096142.51629: done with get_vars() 11701 1727096142.51663: done getting variables 11701 1727096142.51828: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 08:55:42 -0400 (0:00:00.183) 0:00:26.483 ****** 11701 1727096142.51864: entering _queue_task() for managed_node3/fail 11701 1727096142.52623: worker is 1 (out of 1 available) 11701 1727096142.52637: exiting _queue_task() for managed_node3/fail 11701 1727096142.52649: done queuing things up, now waiting for results queue to drain 11701 1727096142.52872: waiting for pending results... 
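The task queued here (tasks/main.yml:11) is a fail action that gets skipped below because network_state is empty. Only network_state != {} is exposed as the false condition in this run; the guard on the initscripts provider and the failure message are assumptions added to round out the sketch:

    - name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
      fail:
        msg: The network_state variable is not supported with the initscripts provider  # assumed wording
      when:
        - network_state != {}                 # shown as the false condition in this run
        - network_provider == "initscripts"   # assumed; never reached here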
11701 1727096142.53201: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11701 1727096142.53637: in run() - task 0afff68d-5257-a05c-c957-00000000007e 11701 1727096142.53697: variable 'ansible_search_path' from source: unknown 11701 1727096142.53704: variable 'ansible_search_path' from source: unknown 11701 1727096142.53790: calling self._execute() 11701 1727096142.53996: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096142.54015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096142.54032: variable 'omit' from source: magic vars 11701 1727096142.54984: variable 'ansible_distribution_major_version' from source: facts 11701 1727096142.54990: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096142.55232: variable 'network_state' from source: role '' defaults 11701 1727096142.55286: Evaluated conditional (network_state != {}): False 11701 1727096142.55296: when evaluation is False, skipping this task 11701 1727096142.55475: _execute() done 11701 1727096142.55478: dumping result to json 11701 1727096142.55480: done dumping result, returning 11701 1727096142.55482: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-a05c-c957-00000000007e] 11701 1727096142.55485: sending task result for task 0afff68d-5257-a05c-c957-00000000007e skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11701 1727096142.55662: no more pending results, returning what we have 11701 1727096142.55666: results queue empty 11701 1727096142.55669: checking for any_errors_fatal 11701 1727096142.55676: done checking for any_errors_fatal 11701 1727096142.55677: checking for max_fail_percentage 11701 1727096142.55678: done checking for max_fail_percentage 11701 1727096142.55679: checking to see if all hosts have failed and the running result is not ok 11701 1727096142.55680: done checking to see if all hosts have failed 11701 1727096142.55681: getting the remaining hosts for this loop 11701 1727096142.55682: done getting the remaining hosts for this loop 11701 1727096142.55686: getting the next task for host managed_node3 11701 1727096142.55694: done getting next task for host managed_node3 11701 1727096142.55697: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11701 1727096142.55701: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11701 1727096142.55720: getting variables 11701 1727096142.55722: in VariableManager get_vars() 11701 1727096142.55764: Calling all_inventory to load vars for managed_node3 11701 1727096142.55766: Calling groups_inventory to load vars for managed_node3 11701 1727096142.55997: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096142.56012: Calling all_plugins_play to load vars for managed_node3 11701 1727096142.56016: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096142.56019: Calling groups_plugins_play to load vars for managed_node3 11701 1727096142.56683: done sending task result for task 0afff68d-5257-a05c-c957-00000000007e 11701 1727096142.56686: WORKER PROCESS EXITING 11701 1727096142.57680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096142.59722: done with get_vars() 11701 1727096142.59752: done getting variables 11701 1727096142.59820: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 08:55:42 -0400 (0:00:00.079) 0:00:26.563 ****** 11701 1727096142.59859: entering _queue_task() for managed_node3/fail 11701 1727096142.60251: worker is 1 (out of 1 available) 11701 1727096142.60266: exiting _queue_task() for managed_node3/fail 11701 1727096142.60282: done queuing things up, now waiting for results queue to drain 11701 1727096142.60283: waiting for pending results... 
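The next fail task (tasks/main.yml:18) is skipped for the same reason. The log again only exposes network_state != {} as the false condition; the version comparison is an assumption suggested by the task name:

    - name: Abort applying the network state configuration if the system version of the managed host is below 8
      fail:
        msg: Applying network_state requires EL8 or later    # assumed wording
      when:
        - network_state != {}                                # false in this run
        - ansible_distribution_major_version | int < 8       # assumed from the task name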
11701 1727096142.60566: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11701 1727096142.61074: in run() - task 0afff68d-5257-a05c-c957-00000000007f 11701 1727096142.61079: variable 'ansible_search_path' from source: unknown 11701 1727096142.61081: variable 'ansible_search_path' from source: unknown 11701 1727096142.61084: calling self._execute() 11701 1727096142.61159: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096142.61207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096142.61374: variable 'omit' from source: magic vars 11701 1727096142.62201: variable 'ansible_distribution_major_version' from source: facts 11701 1727096142.62275: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096142.62544: variable 'network_state' from source: role '' defaults 11701 1727096142.62564: Evaluated conditional (network_state != {}): False 11701 1727096142.62574: when evaluation is False, skipping this task 11701 1727096142.62582: _execute() done 11701 1727096142.62589: dumping result to json 11701 1727096142.62616: done dumping result, returning 11701 1727096142.62631: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-a05c-c957-00000000007f] 11701 1727096142.62640: sending task result for task 0afff68d-5257-a05c-c957-00000000007f 11701 1727096142.62756: done sending task result for task 0afff68d-5257-a05c-c957-00000000007f 11701 1727096142.62764: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11701 1727096142.63010: no more pending results, returning what we have 11701 1727096142.63013: results queue empty 11701 1727096142.63014: checking for any_errors_fatal 11701 1727096142.63019: done checking for any_errors_fatal 11701 1727096142.63019: checking for max_fail_percentage 11701 1727096142.63021: done checking for max_fail_percentage 11701 1727096142.63022: checking to see if all hosts have failed and the running result is not ok 11701 1727096142.63023: done checking to see if all hosts have failed 11701 1727096142.63023: getting the remaining hosts for this loop 11701 1727096142.63024: done getting the remaining hosts for this loop 11701 1727096142.63029: getting the next task for host managed_node3 11701 1727096142.63035: done getting next task for host managed_node3 11701 1727096142.63038: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11701 1727096142.63042: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11701 1727096142.63059: getting variables 11701 1727096142.63060: in VariableManager get_vars() 11701 1727096142.63100: Calling all_inventory to load vars for managed_node3 11701 1727096142.63102: Calling groups_inventory to load vars for managed_node3 11701 1727096142.63105: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096142.63116: Calling all_plugins_play to load vars for managed_node3 11701 1727096142.63127: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096142.63131: Calling groups_plugins_play to load vars for managed_node3 11701 1727096142.64649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096142.66335: done with get_vars() 11701 1727096142.66365: done getting variables 11701 1727096142.66427: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 08:55:42 -0400 (0:00:00.066) 0:00:26.629 ****** 11701 1727096142.66464: entering _queue_task() for managed_node3/fail 11701 1727096142.66838: worker is 1 (out of 1 available) 11701 1727096142.66971: exiting _queue_task() for managed_node3/fail 11701 1727096142.66982: done queuing things up, now waiting for results queue to drain 11701 1727096142.66984: waiting for pending results... 
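The teaming abort task queued here (tasks/main.yml:25) is the first one whose full conditional chain is visible in the trace below: the version and distribution checks evaluate True, while the selectattr filter over network_connections and network_state finds no team interfaces, so the task is skipped. A sketch matching those evaluated conditions; only the failure message is assumed:

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      fail:
        msg: Teaming is not supported on EL10 or later hosts   # assumed wording
      when:
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros
        - network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0
          or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0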
11701 1727096142.67221: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11701 1727096142.67377: in run() - task 0afff68d-5257-a05c-c957-000000000080 11701 1727096142.67474: variable 'ansible_search_path' from source: unknown 11701 1727096142.67477: variable 'ansible_search_path' from source: unknown 11701 1727096142.67481: calling self._execute() 11701 1727096142.67571: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096142.67583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096142.67601: variable 'omit' from source: magic vars 11701 1727096142.68018: variable 'ansible_distribution_major_version' from source: facts 11701 1727096142.68046: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096142.68249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11701 1727096142.70540: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11701 1727096142.70675: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11701 1727096142.70681: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11701 1727096142.70736: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11701 1727096142.70772: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11701 1727096142.70864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096142.70903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096142.70942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096142.71038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096142.71041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096142.71126: variable 'ansible_distribution_major_version' from source: facts 11701 1727096142.71259: Evaluated conditional (ansible_distribution_major_version | int > 9): True 11701 1727096142.71299: variable 'ansible_distribution' from source: facts 11701 1727096142.71309: variable '__network_rh_distros' from source: role '' defaults 11701 1727096142.71324: Evaluated conditional (ansible_distribution in __network_rh_distros): True 11701 1727096142.71620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096142.71690: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096142.71693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096142.71722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096142.71737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096142.71795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096142.71829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096142.71857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096142.71909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096142.72072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096142.72075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096142.72077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096142.72079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096142.72081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096142.72082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096142.72388: variable 'network_connections' from source: task vars 11701 1727096142.72406: variable 'port2_profile' from source: play vars 11701 1727096142.72491: variable 'port2_profile' from source: play vars 11701 1727096142.72744: variable 'port1_profile' from source: play vars 11701 1727096142.72748: variable 'port1_profile' from source: play vars 11701 1727096142.72750: variable 'controller_profile' from source: play vars 
11701 1727096142.72752: variable 'controller_profile' from source: play vars 11701 1727096142.72754: variable 'network_state' from source: role '' defaults 11701 1727096142.72813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11701 1727096142.73023: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11701 1727096142.73072: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11701 1727096142.73117: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11701 1727096142.73152: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11701 1727096142.73292: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11701 1727096142.73298: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11701 1727096142.73300: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096142.73358: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11701 1727096142.73404: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 11701 1727096142.73676: when evaluation is False, skipping this task 11701 1727096142.73679: _execute() done 11701 1727096142.73682: dumping result to json 11701 1727096142.73684: done dumping result, returning 11701 1727096142.73687: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-a05c-c957-000000000080] 11701 1727096142.73690: sending task result for task 0afff68d-5257-a05c-c957-000000000080 11701 1727096142.73761: done sending task result for task 0afff68d-5257-a05c-c957-000000000080 11701 1727096142.73765: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 11701 1727096142.73825: no more pending results, returning what we have 11701 1727096142.73831: results queue empty 11701 1727096142.73832: checking for any_errors_fatal 11701 1727096142.73838: done checking for any_errors_fatal 11701 1727096142.73838: checking for max_fail_percentage 11701 1727096142.73840: done checking for max_fail_percentage 11701 1727096142.73841: checking to see if all hosts have failed and the running result is not ok 11701 1727096142.73842: done checking to see if all hosts have failed 11701 
1727096142.73843: getting the remaining hosts for this loop 11701 1727096142.73844: done getting the remaining hosts for this loop 11701 1727096142.73848: getting the next task for host managed_node3 11701 1727096142.73856: done getting next task for host managed_node3 11701 1727096142.73860: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11701 1727096142.73864: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11701 1727096142.73886: getting variables 11701 1727096142.73889: in VariableManager get_vars() 11701 1727096142.73931: Calling all_inventory to load vars for managed_node3 11701 1727096142.73934: Calling groups_inventory to load vars for managed_node3 11701 1727096142.73937: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096142.73948: Calling all_plugins_play to load vars for managed_node3 11701 1727096142.73951: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096142.73955: Calling groups_plugins_play to load vars for managed_node3 11701 1727096142.75996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096142.77823: done with get_vars() 11701 1727096142.77845: done getting variables 11701 1727096142.77905: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 08:55:42 -0400 (0:00:00.114) 0:00:26.744 ****** 11701 1727096142.77944: entering _queue_task() for managed_node3/dnf 11701 1727096142.78327: worker is 1 (out of 1 available) 11701 1727096142.78340: exiting _queue_task() for managed_node3/dnf 11701 1727096142.78358: done queuing things up, now waiting for results queue to drain 11701 1727096142.78360: waiting for pending results... 
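The DNF check task queued here (tasks/main.yml:36) is backed by the dnf action plugin. The trace below shows the distribution gate evaluating True and the wireless/team gate evaluating False, so nothing is checked or installed in this run. The package list variable and the check_mode usage in this sketch are placeholders for illustration, not the role's verbatim source:

    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      dnf:
        name: "{{ network_packages }}"   # hypothetical variable name
        state: latest
      check_mode: true                   # assumed, since the task only checks for updates
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
        - __network_wireless_connections_defined or __network_team_connections_defined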
11701 1727096142.78665: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11701 1727096142.78906: in run() - task 0afff68d-5257-a05c-c957-000000000081 11701 1727096142.78910: variable 'ansible_search_path' from source: unknown 11701 1727096142.78914: variable 'ansible_search_path' from source: unknown 11701 1727096142.78920: calling self._execute() 11701 1727096142.79028: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096142.79042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096142.79057: variable 'omit' from source: magic vars 11701 1727096142.79584: variable 'ansible_distribution_major_version' from source: facts 11701 1727096142.79587: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096142.80182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11701 1727096142.83052: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11701 1727096142.83142: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11701 1727096142.83187: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11701 1727096142.83317: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11701 1727096142.83392: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11701 1727096142.83747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096142.83750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096142.83986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096142.84211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096142.84214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096142.84458: variable 'ansible_distribution' from source: facts 11701 1727096142.84754: variable 'ansible_distribution_major_version' from source: facts 11701 1727096142.84758: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 11701 1727096142.85175: variable '__network_wireless_connections_defined' from source: role '' defaults 11701 1727096142.85222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096142.85252: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096142.85396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096142.85436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096142.85454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096142.85503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096142.85530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096142.85634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096142.85678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096142.85731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096142.85771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096142.85851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096142.85960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096142.86005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096142.86060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096142.86332: variable 'network_connections' from source: task vars 11701 1727096142.86573: variable 'port2_profile' from source: play vars 11701 1727096142.86577: variable 'port2_profile' from source: play vars 11701 1727096142.86580: variable 'port1_profile' from source: play vars 11701 1727096142.86755: variable 'port1_profile' from source: play vars 11701 1727096142.86772: variable 'controller_profile' from source: play vars 
11701 1727096142.86841: variable 'controller_profile' from source: play vars 11701 1727096142.87040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11701 1727096142.87480: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11701 1727096142.87526: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11701 1727096142.87601: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11701 1727096142.87703: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11701 1727096142.87753: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11701 1727096142.87999: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11701 1727096142.88003: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096142.88005: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11701 1727096142.88093: variable '__network_team_connections_defined' from source: role '' defaults 11701 1727096142.88759: variable 'network_connections' from source: task vars 11701 1727096142.88762: variable 'port2_profile' from source: play vars 11701 1727096142.88765: variable 'port2_profile' from source: play vars 11701 1727096142.88766: variable 'port1_profile' from source: play vars 11701 1727096142.88890: variable 'port1_profile' from source: play vars 11701 1727096142.88902: variable 'controller_profile' from source: play vars 11701 1727096142.88960: variable 'controller_profile' from source: play vars 11701 1727096142.89104: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11701 1727096142.89111: when evaluation is False, skipping this task 11701 1727096142.89117: _execute() done 11701 1727096142.89122: dumping result to json 11701 1727096142.89128: done dumping result, returning 11701 1727096142.89138: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-a05c-c957-000000000081] 11701 1727096142.89145: sending task result for task 0afff68d-5257-a05c-c957-000000000081 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11701 1727096142.89288: no more pending results, returning what we have 11701 1727096142.89292: results queue empty 11701 1727096142.89293: checking for any_errors_fatal 11701 1727096142.89301: done checking for any_errors_fatal 11701 1727096142.89302: checking for max_fail_percentage 11701 1727096142.89304: done checking for max_fail_percentage 11701 1727096142.89304: checking to see if all hosts have failed and the running result is not ok 11701 
1727096142.89305: done checking to see if all hosts have failed 11701 1727096142.89306: getting the remaining hosts for this loop 11701 1727096142.89307: done getting the remaining hosts for this loop 11701 1727096142.89311: getting the next task for host managed_node3 11701 1727096142.89318: done getting next task for host managed_node3 11701 1727096142.89322: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11701 1727096142.89326: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11701 1727096142.89346: getting variables 11701 1727096142.89348: in VariableManager get_vars() 11701 1727096142.89394: Calling all_inventory to load vars for managed_node3 11701 1727096142.89397: Calling groups_inventory to load vars for managed_node3 11701 1727096142.89400: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096142.89411: Calling all_plugins_play to load vars for managed_node3 11701 1727096142.89414: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096142.89418: Calling groups_plugins_play to load vars for managed_node3 11701 1727096142.90781: done sending task result for task 0afff68d-5257-a05c-c957-000000000081 11701 1727096142.90784: WORKER PROCESS EXITING 11701 1727096142.92402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096142.94093: done with get_vars() 11701 1727096142.94124: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11701 1727096142.94209: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 08:55:42 -0400 (0:00:00.163) 0:00:26.907 ****** 11701 1727096142.94258: entering _queue_task() for managed_node3/yum 11701 1727096142.94989: worker is 1 (out of 1 available) 11701 1727096142.95002: exiting _queue_task() for managed_node3/yum 11701 1727096142.95014: done queuing things up, now waiting for results queue to drain 11701 1727096142.95015: waiting for pending results... 
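The YUM variant queued here (tasks/main.yml:48) mirrors the DNF task; note the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line above, since yum resolves to the dnf action on this control node. Only the version comparison is visible in this run (it evaluates False, so the task is skipped before any other condition is checked); the rest of the sketch reuses the assumptions from the DNF example:

    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      yum:
        name: "{{ network_packages }}"   # hypothetical variable name
        state: latest
      check_mode: true                   # assumed
      when:
        - ansible_distribution_major_version | int < 8   # false in this run
        - __network_wireless_connections_defined or __network_team_connections_defined   # assumed by symmetry with the dnf task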
11701 1727096142.95280: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11701 1727096142.95780: in run() - task 0afff68d-5257-a05c-c957-000000000082 11701 1727096142.95799: variable 'ansible_search_path' from source: unknown 11701 1727096142.95806: variable 'ansible_search_path' from source: unknown 11701 1727096142.95843: calling self._execute() 11701 1727096142.95940: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096142.95964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096142.96079: variable 'omit' from source: magic vars 11701 1727096142.96829: variable 'ansible_distribution_major_version' from source: facts 11701 1727096142.96848: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096142.97144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11701 1727096143.02089: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11701 1727096143.02293: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11701 1727096143.02339: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11701 1727096143.02410: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11701 1727096143.02515: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11701 1727096143.02714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096143.02752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096143.02838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096143.02936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096143.02959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096143.03187: variable 'ansible_distribution_major_version' from source: facts 11701 1727096143.03209: Evaluated conditional (ansible_distribution_major_version | int < 8): False 11701 1727096143.03254: when evaluation is False, skipping this task 11701 1727096143.03263: _execute() done 11701 1727096143.03274: dumping result to json 11701 1727096143.03282: done dumping result, returning 11701 1727096143.03374: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-a05c-c957-000000000082] 11701 
1727096143.03468: sending task result for task 0afff68d-5257-a05c-c957-000000000082 11701 1727096143.03546: done sending task result for task 0afff68d-5257-a05c-c957-000000000082 11701 1727096143.03550: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 11701 1727096143.03622: no more pending results, returning what we have 11701 1727096143.03627: results queue empty 11701 1727096143.03628: checking for any_errors_fatal 11701 1727096143.03635: done checking for any_errors_fatal 11701 1727096143.03636: checking for max_fail_percentage 11701 1727096143.03638: done checking for max_fail_percentage 11701 1727096143.03639: checking to see if all hosts have failed and the running result is not ok 11701 1727096143.03640: done checking to see if all hosts have failed 11701 1727096143.03641: getting the remaining hosts for this loop 11701 1727096143.03642: done getting the remaining hosts for this loop 11701 1727096143.03646: getting the next task for host managed_node3 11701 1727096143.03654: done getting next task for host managed_node3 11701 1727096143.03659: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11701 1727096143.03663: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11701 1727096143.03687: getting variables 11701 1727096143.03689: in VariableManager get_vars() 11701 1727096143.03733: Calling all_inventory to load vars for managed_node3 11701 1727096143.03736: Calling groups_inventory to load vars for managed_node3 11701 1727096143.03740: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096143.03750: Calling all_plugins_play to load vars for managed_node3 11701 1727096143.03754: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096143.03757: Calling groups_plugins_play to load vars for managed_node3 11701 1727096143.07333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096143.09566: done with get_vars() 11701 1727096143.09598: done getting variables 11701 1727096143.09659: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 08:55:43 -0400 (0:00:00.154) 0:00:27.061 ****** 11701 1727096143.09696: entering _queue_task() for managed_node3/fail 11701 1727096143.10030: worker is 1 (out of 1 available) 11701 1727096143.10044: exiting _queue_task() for managed_node3/fail 11701 1727096143.10055: done queuing things up, now waiting for results queue to drain 11701 1727096143.10057: waiting for pending results... 
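The YUM variant of the same check has just been skipped because ansible_distribution_major_version | int < 8 is False on this host, and the log also shows ansible-core redirecting ansible.builtin.yum to the dnf action plugin. Below is a sketch of how a version-gated task pair might look; the pairing and module arguments are assumptions, and only the first when expression is the condition actually reported in the log.

- name: Check for updates through YUM (distribution major version 7 or older)
  ansible.builtin.yum:
    name: "{{ network_packages }}"
    state: latest
  when: ansible_distribution_major_version | int < 8

- name: Check for updates through DNF (distribution major version 8 or newer)
  ansible.builtin.dnf:
    name: "{{ network_packages }}"
    state: latest
  when: ansible_distribution_major_version | int >= 8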
11701 1727096143.10705: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11701 1727096143.10907: in run() - task 0afff68d-5257-a05c-c957-000000000083 11701 1727096143.10929: variable 'ansible_search_path' from source: unknown 11701 1727096143.11375: variable 'ansible_search_path' from source: unknown 11701 1727096143.11379: calling self._execute() 11701 1727096143.11382: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096143.11384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096143.11386: variable 'omit' from source: magic vars 11701 1727096143.11909: variable 'ansible_distribution_major_version' from source: facts 11701 1727096143.12045: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096143.12375: variable '__network_wireless_connections_defined' from source: role '' defaults 11701 1727096143.12842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11701 1727096143.16132: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11701 1727096143.16220: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11701 1727096143.16269: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11701 1727096143.16308: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11701 1727096143.16340: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11701 1727096143.16431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096143.16466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096143.16505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096143.16548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096143.16570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096143.16642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096143.16671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096143.16721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096143.16781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096143.16804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096143.16851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096143.16881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096143.16919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096143.16972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096143.16975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096143.17356: variable 'network_connections' from source: task vars 11701 1727096143.17377: variable 'port2_profile' from source: play vars 11701 1727096143.17575: variable 'port2_profile' from source: play vars 11701 1727096143.17580: variable 'port1_profile' from source: play vars 11701 1727096143.17643: variable 'port1_profile' from source: play vars 11701 1727096143.17790: variable 'controller_profile' from source: play vars 11701 1727096143.17850: variable 'controller_profile' from source: play vars 11701 1727096143.18011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11701 1727096143.18476: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11701 1727096143.18520: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11701 1727096143.18562: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11701 1727096143.18601: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11701 1727096143.18659: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11701 1727096143.18712: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11701 1727096143.18743: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096143.18781: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11701 1727096143.18837: variable '__network_team_connections_defined' from source: role '' defaults 11701 1727096143.19100: variable 'network_connections' from source: task vars 11701 1727096143.19172: variable 'port2_profile' from source: play vars 11701 1727096143.19175: variable 'port2_profile' from source: play vars 11701 1727096143.19187: variable 'port1_profile' from source: play vars 11701 1727096143.19253: variable 'port1_profile' from source: play vars 11701 1727096143.19268: variable 'controller_profile' from source: play vars 11701 1727096143.19334: variable 'controller_profile' from source: play vars 11701 1727096143.19365: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11701 1727096143.19385: when evaluation is False, skipping this task 11701 1727096143.19392: _execute() done 11701 1727096143.19399: dumping result to json 11701 1727096143.19405: done dumping result, returning 11701 1727096143.19427: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-a05c-c957-000000000083] 11701 1727096143.19431: sending task result for task 0afff68d-5257-a05c-c957-000000000083 11701 1727096143.19611: done sending task result for task 0afff68d-5257-a05c-c957-000000000083 11701 1727096143.19614: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11701 1727096143.19695: no more pending results, returning what we have 11701 1727096143.19700: results queue empty 11701 1727096143.19701: checking for any_errors_fatal 11701 1727096143.19706: done checking for any_errors_fatal 11701 1727096143.19707: checking for max_fail_percentage 11701 1727096143.19709: done checking for max_fail_percentage 11701 1727096143.19710: checking to see if all hosts have failed and the running result is not ok 11701 1727096143.19711: done checking to see if all hosts have failed 11701 1727096143.19712: getting the remaining hosts for this loop 11701 1727096143.19713: done getting the remaining hosts for this loop 11701 1727096143.19717: getting the next task for host managed_node3 11701 1727096143.19725: done getting next task for host managed_node3 11701 1727096143.19730: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 11701 1727096143.19734: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? 
False, did start at task? False 11701 1727096143.19754: getting variables 11701 1727096143.19756: in VariableManager get_vars() 11701 1727096143.19802: Calling all_inventory to load vars for managed_node3 11701 1727096143.19805: Calling groups_inventory to load vars for managed_node3 11701 1727096143.19808: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096143.19819: Calling all_plugins_play to load vars for managed_node3 11701 1727096143.19823: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096143.19826: Calling groups_plugins_play to load vars for managed_node3 11701 1727096143.22160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096143.24010: done with get_vars() 11701 1727096143.24036: done getting variables 11701 1727096143.24092: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 08:55:43 -0400 (0:00:00.144) 0:00:27.205 ****** 11701 1727096143.24127: entering _queue_task() for managed_node3/package 11701 1727096143.24637: worker is 1 (out of 1 available) 11701 1727096143.24648: exiting _queue_task() for managed_node3/package 11701 1727096143.24658: done queuing things up, now waiting for results queue to drain 11701 1727096143.24660: waiting for pending results... 
11701 1727096143.24893: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 11701 1727096143.25174: in run() - task 0afff68d-5257-a05c-c957-000000000084 11701 1727096143.25179: variable 'ansible_search_path' from source: unknown 11701 1727096143.25182: variable 'ansible_search_path' from source: unknown 11701 1727096143.25185: calling self._execute() 11701 1727096143.25201: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096143.25212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096143.25226: variable 'omit' from source: magic vars 11701 1727096143.25610: variable 'ansible_distribution_major_version' from source: facts 11701 1727096143.25633: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096143.25944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11701 1727096143.26496: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11701 1727096143.26500: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11701 1727096143.26502: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11701 1727096143.26548: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11701 1727096143.26667: variable 'network_packages' from source: role '' defaults 11701 1727096143.26785: variable '__network_provider_setup' from source: role '' defaults 11701 1727096143.26802: variable '__network_service_name_default_nm' from source: role '' defaults 11701 1727096143.26873: variable '__network_service_name_default_nm' from source: role '' defaults 11701 1727096143.26888: variable '__network_packages_default_nm' from source: role '' defaults 11701 1727096143.26957: variable '__network_packages_default_nm' from source: role '' defaults 11701 1727096143.27143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11701 1727096143.29519: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11701 1727096143.29873: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11701 1727096143.29877: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11701 1727096143.29880: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11701 1727096143.29882: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11701 1727096143.29939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096143.29978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096143.30013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096143.30057: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096143.30079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096143.30131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096143.30159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096143.30192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096143.30239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096143.30259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096143.30529: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11701 1727096143.30603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096143.30655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096143.30686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096143.30724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096143.30742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096143.30832: variable 'ansible_python' from source: facts 11701 1727096143.30869: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11701 1727096143.30966: variable '__network_wpa_supplicant_required' from source: role '' defaults 11701 1727096143.31043: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11701 1727096143.31291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096143.31294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 11701 1727096143.31296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096143.31298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096143.31300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096143.31412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096143.31451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096143.31540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096143.31655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096143.31676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096143.31864: variable 'network_connections' from source: task vars 11701 1727096143.31871: variable 'port2_profile' from source: play vars 11701 1727096143.31975: variable 'port2_profile' from source: play vars 11701 1727096143.31990: variable 'port1_profile' from source: play vars 11701 1727096143.32091: variable 'port1_profile' from source: play vars 11701 1727096143.32101: variable 'controller_profile' from source: play vars 11701 1727096143.32333: variable 'controller_profile' from source: play vars 11701 1727096143.32336: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11701 1727096143.32339: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11701 1727096143.32341: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096143.32376: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11701 1727096143.32431: variable '__network_wireless_connections_defined' from source: role '' defaults 11701 1727096143.32729: variable 'network_connections' from source: task vars 11701 1727096143.32732: variable 'port2_profile' from source: play vars 11701 1727096143.32838: variable 'port2_profile' from source: play vars 11701 
1727096143.32847: variable 'port1_profile' from source: play vars 11701 1727096143.32948: variable 'port1_profile' from source: play vars 11701 1727096143.32961: variable 'controller_profile' from source: play vars 11701 1727096143.33065: variable 'controller_profile' from source: play vars 11701 1727096143.33099: variable '__network_packages_default_wireless' from source: role '' defaults 11701 1727096143.33184: variable '__network_wireless_connections_defined' from source: role '' defaults 11701 1727096143.33502: variable 'network_connections' from source: task vars 11701 1727096143.33506: variable 'port2_profile' from source: play vars 11701 1727096143.33632: variable 'port2_profile' from source: play vars 11701 1727096143.33639: variable 'port1_profile' from source: play vars 11701 1727096143.33642: variable 'port1_profile' from source: play vars 11701 1727096143.33646: variable 'controller_profile' from source: play vars 11701 1727096143.33717: variable 'controller_profile' from source: play vars 11701 1727096143.33741: variable '__network_packages_default_team' from source: role '' defaults 11701 1727096143.33861: variable '__network_team_connections_defined' from source: role '' defaults 11701 1727096143.34143: variable 'network_connections' from source: task vars 11701 1727096143.34146: variable 'port2_profile' from source: play vars 11701 1727096143.34211: variable 'port2_profile' from source: play vars 11701 1727096143.34219: variable 'port1_profile' from source: play vars 11701 1727096143.34293: variable 'port1_profile' from source: play vars 11701 1727096143.34297: variable 'controller_profile' from source: play vars 11701 1727096143.34362: variable 'controller_profile' from source: play vars 11701 1727096143.34417: variable '__network_service_name_default_initscripts' from source: role '' defaults 11701 1727096143.34485: variable '__network_service_name_default_initscripts' from source: role '' defaults 11701 1727096143.34491: variable '__network_packages_default_initscripts' from source: role '' defaults 11701 1727096143.34548: variable '__network_packages_default_initscripts' from source: role '' defaults 11701 1727096143.34772: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11701 1727096143.35234: variable 'network_connections' from source: task vars 11701 1727096143.35237: variable 'port2_profile' from source: play vars 11701 1727096143.35382: variable 'port2_profile' from source: play vars 11701 1727096143.35385: variable 'port1_profile' from source: play vars 11701 1727096143.35387: variable 'port1_profile' from source: play vars 11701 1727096143.35389: variable 'controller_profile' from source: play vars 11701 1727096143.35422: variable 'controller_profile' from source: play vars 11701 1727096143.35435: variable 'ansible_distribution' from source: facts 11701 1727096143.35438: variable '__network_rh_distros' from source: role '' defaults 11701 1727096143.35444: variable 'ansible_distribution_major_version' from source: facts 11701 1727096143.35460: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11701 1727096143.35627: variable 'ansible_distribution' from source: facts 11701 1727096143.35630: variable '__network_rh_distros' from source: role '' defaults 11701 1727096143.35636: variable 'ansible_distribution_major_version' from source: facts 11701 1727096143.35654: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11701 1727096143.35819: 
variable 'ansible_distribution' from source: facts 11701 1727096143.35822: variable '__network_rh_distros' from source: role '' defaults 11701 1727096143.35829: variable 'ansible_distribution_major_version' from source: facts 11701 1727096143.35873: variable 'network_provider' from source: set_fact 11701 1727096143.35891: variable 'ansible_facts' from source: unknown 11701 1727096143.36588: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 11701 1727096143.36592: when evaluation is False, skipping this task 11701 1727096143.36594: _execute() done 11701 1727096143.36597: dumping result to json 11701 1727096143.36599: done dumping result, returning 11701 1727096143.36621: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-a05c-c957-000000000084] 11701 1727096143.36624: sending task result for task 0afff68d-5257-a05c-c957-000000000084 11701 1727096143.36750: done sending task result for task 0afff68d-5257-a05c-c957-000000000084 11701 1727096143.36752: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 11701 1727096143.36832: no more pending results, returning what we have 11701 1727096143.36836: results queue empty 11701 1727096143.36837: checking for any_errors_fatal 11701 1727096143.36842: done checking for any_errors_fatal 11701 1727096143.36843: checking for max_fail_percentage 11701 1727096143.36845: done checking for max_fail_percentage 11701 1727096143.36846: checking to see if all hosts have failed and the running result is not ok 11701 1727096143.36846: done checking to see if all hosts have failed 11701 1727096143.36847: getting the remaining hosts for this loop 11701 1727096143.36848: done getting the remaining hosts for this loop 11701 1727096143.36857: getting the next task for host managed_node3 11701 1727096143.36863: done getting next task for host managed_node3 11701 1727096143.36870: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11701 1727096143.36873: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11701 1727096143.36889: getting variables 11701 1727096143.36890: in VariableManager get_vars() 11701 1727096143.36926: Calling all_inventory to load vars for managed_node3 11701 1727096143.36928: Calling groups_inventory to load vars for managed_node3 11701 1727096143.36930: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096143.36938: Calling all_plugins_play to load vars for managed_node3 11701 1727096143.36941: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096143.36943: Calling groups_plugins_play to load vars for managed_node3 11701 1727096143.38760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096143.40446: done with get_vars() 11701 1727096143.40480: done getting variables 11701 1727096143.40534: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 08:55:43 -0400 (0:00:00.164) 0:00:27.370 ****** 11701 1727096143.40578: entering _queue_task() for managed_node3/package 11701 1727096143.40947: worker is 1 (out of 1 available) 11701 1727096143.40963: exiting _queue_task() for managed_node3/package 11701 1727096143.40978: done queuing things up, now waiting for results queue to drain 11701 1727096143.40980: waiting for pending results... 
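The "Install packages" task above is skipped because every entry in network_packages is already present in ansible_facts.packages, so the reported condition not network_packages is subset(ansible_facts.packages.keys()) is False. The subset test is provided by the mathstuff test plugin shown being loaded throughout this log. A minimal sketch of a task gated this way; the package module arguments are assumptions.

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())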
11701 1727096143.41413: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11701 1727096143.41509: in run() - task 0afff68d-5257-a05c-c957-000000000085 11701 1727096143.41513: variable 'ansible_search_path' from source: unknown 11701 1727096143.41515: variable 'ansible_search_path' from source: unknown 11701 1727096143.41518: calling self._execute() 11701 1727096143.41596: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096143.41600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096143.41618: variable 'omit' from source: magic vars 11701 1727096143.42002: variable 'ansible_distribution_major_version' from source: facts 11701 1727096143.42015: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096143.42129: variable 'network_state' from source: role '' defaults 11701 1727096143.42137: Evaluated conditional (network_state != {}): False 11701 1727096143.42139: when evaluation is False, skipping this task 11701 1727096143.42142: _execute() done 11701 1727096143.42144: dumping result to json 11701 1727096143.42472: done dumping result, returning 11701 1727096143.42476: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-a05c-c957-000000000085] 11701 1727096143.42478: sending task result for task 0afff68d-5257-a05c-c957-000000000085 11701 1727096143.42551: done sending task result for task 0afff68d-5257-a05c-c957-000000000085 11701 1727096143.42554: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11701 1727096143.42629: no more pending results, returning what we have 11701 1727096143.42631: results queue empty 11701 1727096143.42632: checking for any_errors_fatal 11701 1727096143.42637: done checking for any_errors_fatal 11701 1727096143.42637: checking for max_fail_percentage 11701 1727096143.42639: done checking for max_fail_percentage 11701 1727096143.42640: checking to see if all hosts have failed and the running result is not ok 11701 1727096143.42640: done checking to see if all hosts have failed 11701 1727096143.42641: getting the remaining hosts for this loop 11701 1727096143.42642: done getting the remaining hosts for this loop 11701 1727096143.42645: getting the next task for host managed_node3 11701 1727096143.42650: done getting next task for host managed_node3 11701 1727096143.42656: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11701 1727096143.42659: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11701 1727096143.42678: getting variables 11701 1727096143.42680: in VariableManager get_vars() 11701 1727096143.42721: Calling all_inventory to load vars for managed_node3 11701 1727096143.42724: Calling groups_inventory to load vars for managed_node3 11701 1727096143.42726: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096143.42735: Calling all_plugins_play to load vars for managed_node3 11701 1727096143.42738: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096143.42741: Calling groups_plugins_play to load vars for managed_node3 11701 1727096143.44125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096143.45761: done with get_vars() 11701 1727096143.45790: done getting variables 11701 1727096143.45850: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 08:55:43 -0400 (0:00:00.053) 0:00:27.423 ****** 11701 1727096143.45889: entering _queue_task() for managed_node3/package 11701 1727096143.46237: worker is 1 (out of 1 available) 11701 1727096143.46249: exiting _queue_task() for managed_node3/package 11701 1727096143.46264: done queuing things up, now waiting for results queue to drain 11701 1727096143.46265: waiting for pending results... 
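The nmstate-related install task above is skipped because network_state still has the role default of {} and the condition network_state != {} is therefore False. A sketch of what such a conditional install might look like; the exact package names are assumptions for illustration.

- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager   # assumed package names
      - nmstate
    state: present
  when: network_state != {}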
11701 1727096143.46587: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11701 1727096143.46733: in run() - task 0afff68d-5257-a05c-c957-000000000086 11701 1727096143.46775: variable 'ansible_search_path' from source: unknown 11701 1727096143.46778: variable 'ansible_search_path' from source: unknown 11701 1727096143.46789: calling self._execute() 11701 1727096143.46891: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096143.47073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096143.47076: variable 'omit' from source: magic vars 11701 1727096143.47315: variable 'ansible_distribution_major_version' from source: facts 11701 1727096143.47329: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096143.47469: variable 'network_state' from source: role '' defaults 11701 1727096143.47479: Evaluated conditional (network_state != {}): False 11701 1727096143.47483: when evaluation is False, skipping this task 11701 1727096143.47485: _execute() done 11701 1727096143.47488: dumping result to json 11701 1727096143.47490: done dumping result, returning 11701 1727096143.47500: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-a05c-c957-000000000086] 11701 1727096143.47503: sending task result for task 0afff68d-5257-a05c-c957-000000000086 11701 1727096143.47939: done sending task result for task 0afff68d-5257-a05c-c957-000000000086 11701 1727096143.47941: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11701 1727096143.48011: no more pending results, returning what we have 11701 1727096143.48014: results queue empty 11701 1727096143.48015: checking for any_errors_fatal 11701 1727096143.48020: done checking for any_errors_fatal 11701 1727096143.48021: checking for max_fail_percentage 11701 1727096143.48022: done checking for max_fail_percentage 11701 1727096143.48023: checking to see if all hosts have failed and the running result is not ok 11701 1727096143.48024: done checking to see if all hosts have failed 11701 1727096143.48025: getting the remaining hosts for this loop 11701 1727096143.48026: done getting the remaining hosts for this loop 11701 1727096143.48029: getting the next task for host managed_node3 11701 1727096143.48035: done getting next task for host managed_node3 11701 1727096143.48039: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11701 1727096143.48043: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11701 1727096143.48062: getting variables 11701 1727096143.48063: in VariableManager get_vars() 11701 1727096143.48103: Calling all_inventory to load vars for managed_node3 11701 1727096143.48106: Calling groups_inventory to load vars for managed_node3 11701 1727096143.48108: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096143.48117: Calling all_plugins_play to load vars for managed_node3 11701 1727096143.48120: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096143.48122: Calling groups_plugins_play to load vars for managed_node3 11701 1727096143.49840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096143.51764: done with get_vars() 11701 1727096143.51796: done getting variables 11701 1727096143.51854: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 08:55:43 -0400 (0:00:00.060) 0:00:27.483 ****** 11701 1727096143.51895: entering _queue_task() for managed_node3/service 11701 1727096143.52262: worker is 1 (out of 1 available) 11701 1727096143.52278: exiting _queue_task() for managed_node3/service 11701 1727096143.52290: done queuing things up, now waiting for results queue to drain 11701 1727096143.52291: waiting for pending results... 
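The python3-libnmstate install is skipped for the same reason: network_state is empty. For contrast, here is a hypothetical play that supplies a non-empty network_state in nmstate-style form; with a value like this, the two network_state-gated tasks above would no longer be skipped. The interface name and settings are invented for illustration and are not taken from this run.

- hosts: managed_node3
  vars:
    network_state:
      interfaces:
        - name: eth1          # hypothetical interface
          type: ethernet
          state: up
          ipv4:
            enabled: true
            dhcp: true
  roles:
    - fedora.linux_system_roles.network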
11701 1727096143.52631: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11701 1727096143.52982: in run() - task 0afff68d-5257-a05c-c957-000000000087 11701 1727096143.52987: variable 'ansible_search_path' from source: unknown 11701 1727096143.52990: variable 'ansible_search_path' from source: unknown 11701 1727096143.52993: calling self._execute() 11701 1727096143.53064: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096143.53074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096143.53086: variable 'omit' from source: magic vars 11701 1727096143.53483: variable 'ansible_distribution_major_version' from source: facts 11701 1727096143.53496: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096143.53701: variable '__network_wireless_connections_defined' from source: role '' defaults 11701 1727096143.53976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11701 1727096143.57043: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11701 1727096143.57232: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11701 1727096143.57381: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11701 1727096143.57415: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11701 1727096143.57442: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11701 1727096143.57625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096143.57654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096143.57825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096143.57871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096143.57885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096143.57988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096143.58009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096143.58145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 11701 1727096143.58185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096143.58197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096143.58233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096143.58374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096143.58395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096143.58482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096143.58485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096143.58832: variable 'network_connections' from source: task vars 11701 1727096143.58845: variable 'port2_profile' from source: play vars 11701 1727096143.58940: variable 'port2_profile' from source: play vars 11701 1727096143.58951: variable 'port1_profile' from source: play vars 11701 1727096143.59131: variable 'port1_profile' from source: play vars 11701 1727096143.59135: variable 'controller_profile' from source: play vars 11701 1727096143.59491: variable 'controller_profile' from source: play vars 11701 1727096143.59494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11701 1727096143.59976: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11701 1727096143.59987: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11701 1727096143.60015: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11701 1727096143.60040: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11701 1727096143.60100: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11701 1727096143.60118: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11701 1727096143.60139: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096143.60164: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11701 1727096143.60326: variable '__network_team_connections_defined' from source: role '' defaults 11701 1727096143.60907: variable 'network_connections' from source: task vars 11701 1727096143.60912: variable 'port2_profile' from source: play vars 11701 1727096143.61076: variable 'port2_profile' from source: play vars 11701 1727096143.61084: variable 'port1_profile' from source: play vars 11701 1727096143.61145: variable 'port1_profile' from source: play vars 11701 1727096143.61153: variable 'controller_profile' from source: play vars 11701 1727096143.61363: variable 'controller_profile' from source: play vars 11701 1727096143.61367: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11701 1727096143.61381: when evaluation is False, skipping this task 11701 1727096143.61384: _execute() done 11701 1727096143.61386: dumping result to json 11701 1727096143.61388: done dumping result, returning 11701 1727096143.61390: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-a05c-c957-000000000087] 11701 1727096143.61393: sending task result for task 0afff68d-5257-a05c-c957-000000000087 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11701 1727096143.61535: no more pending results, returning what we have 11701 1727096143.61539: results queue empty 11701 1727096143.61540: checking for any_errors_fatal 11701 1727096143.61547: done checking for any_errors_fatal 11701 1727096143.61548: checking for max_fail_percentage 11701 1727096143.61550: done checking for max_fail_percentage 11701 1727096143.61551: checking to see if all hosts have failed and the running result is not ok 11701 1727096143.61555: done checking to see if all hosts have failed 11701 1727096143.61556: getting the remaining hosts for this loop 11701 1727096143.61557: done getting the remaining hosts for this loop 11701 1727096143.61561: getting the next task for host managed_node3 11701 1727096143.61571: done getting next task for host managed_node3 11701 1727096143.61576: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11701 1727096143.61580: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11701 1727096143.61602: getting variables 11701 1727096143.61604: in VariableManager get_vars() 11701 1727096143.61649: Calling all_inventory to load vars for managed_node3 11701 1727096143.61654: Calling groups_inventory to load vars for managed_node3 11701 1727096143.61658: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096143.61872: Calling all_plugins_play to load vars for managed_node3 11701 1727096143.61878: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096143.61882: Calling groups_plugins_play to load vars for managed_node3 11701 1727096143.62476: done sending task result for task 0afff68d-5257-a05c-c957-000000000087 11701 1727096143.62481: WORKER PROCESS EXITING 11701 1727096143.64371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096143.66126: done with get_vars() 11701 1727096143.66153: done getting variables 11701 1727096143.66219: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 08:55:43 -0400 (0:00:00.143) 0:00:27.627 ****** 11701 1727096143.66258: entering _queue_task() for managed_node3/service 11701 1727096143.66795: worker is 1 (out of 1 available) 11701 1727096143.66805: exiting _queue_task() for managed_node3/service 11701 1727096143.66816: done queuing things up, now waiting for results queue to drain 11701 1727096143.66817: waiting for pending results... 
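[editor note] The preceding task, "fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces", was skipped because the conditional (__network_wireless_connections_defined or __network_team_connections_defined) evaluated to False: the resolved connection profiles in play (controller_profile, port1_profile, port2_profile) apparently define neither wireless nor team interfaces, so both role-default flags stayed false. A task guarded this way might look roughly like the sketch below; this is an illustrative reconstruction from the task name and conditional shown in the log, not the role's actual tasks file, and the module and state used are assumptions.

    # Illustrative sketch only -- not the actual fedora.linux_system_roles.network tasks file.
    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager        # service name assumed for illustration
        state: restarted
      when: __network_wireless_connections_defined or __network_team_connections_defined

When the when: expression is false, Ansible records exactly the skip result seen above ("skip_reason": "Conditional result was False") and moves on to the next task in the block.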
11701 1727096143.67186: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11701 1727096143.67191: in run() - task 0afff68d-5257-a05c-c957-000000000088 11701 1727096143.67193: variable 'ansible_search_path' from source: unknown 11701 1727096143.67196: variable 'ansible_search_path' from source: unknown 11701 1727096143.67198: calling self._execute() 11701 1727096143.67295: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096143.67298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096143.67309: variable 'omit' from source: magic vars 11701 1727096143.67908: variable 'ansible_distribution_major_version' from source: facts 11701 1727096143.67911: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096143.67915: variable 'network_provider' from source: set_fact 11701 1727096143.67918: variable 'network_state' from source: role '' defaults 11701 1727096143.67921: Evaluated conditional (network_provider == "nm" or network_state != {}): True 11701 1727096143.67924: variable 'omit' from source: magic vars 11701 1727096143.67974: variable 'omit' from source: magic vars 11701 1727096143.68001: variable 'network_service_name' from source: role '' defaults 11701 1727096143.68074: variable 'network_service_name' from source: role '' defaults 11701 1727096143.68189: variable '__network_provider_setup' from source: role '' defaults 11701 1727096143.68194: variable '__network_service_name_default_nm' from source: role '' defaults 11701 1727096143.68254: variable '__network_service_name_default_nm' from source: role '' defaults 11701 1727096143.68273: variable '__network_packages_default_nm' from source: role '' defaults 11701 1727096143.68329: variable '__network_packages_default_nm' from source: role '' defaults 11701 1727096143.68648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11701 1727096143.73202: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11701 1727096143.73282: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11701 1727096143.73433: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11701 1727096143.73538: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11701 1727096143.73571: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11701 1727096143.73973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096143.73977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096143.73980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096143.73982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 11701 1727096143.74172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096143.74175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096143.74178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096143.74180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096143.74217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096143.74231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096143.74757: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11701 1727096143.74989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096143.75013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096143.75040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096143.75201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096143.75215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096143.75425: variable 'ansible_python' from source: facts 11701 1727096143.75449: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11701 1727096143.75647: variable '__network_wpa_supplicant_required' from source: role '' defaults 11701 1727096143.75844: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11701 1727096143.76151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096143.76182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096143.76207: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096143.76246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096143.76384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096143.76430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096143.76453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096143.76597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096143.76637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096143.76651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096143.76975: variable 'network_connections' from source: task vars 11701 1727096143.76979: variable 'port2_profile' from source: play vars 11701 1727096143.76987: variable 'port2_profile' from source: play vars 11701 1727096143.77001: variable 'port1_profile' from source: play vars 11701 1727096143.77083: variable 'port1_profile' from source: play vars 11701 1727096143.77094: variable 'controller_profile' from source: play vars 11701 1727096143.77176: variable 'controller_profile' from source: play vars 11701 1727096143.77290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11701 1727096143.77519: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11701 1727096143.77580: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11701 1727096143.77622: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11701 1727096143.77666: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11701 1727096143.77774: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11701 1727096143.77777: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11701 1727096143.77808: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, 
class_only=False) 11701 1727096143.77843: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11701 1727096143.77933: variable '__network_wireless_connections_defined' from source: role '' defaults 11701 1727096143.78201: variable 'network_connections' from source: task vars 11701 1727096143.78283: variable 'port2_profile' from source: play vars 11701 1727096143.78287: variable 'port2_profile' from source: play vars 11701 1727096143.78374: variable 'port1_profile' from source: play vars 11701 1727096143.78377: variable 'port1_profile' from source: play vars 11701 1727096143.78391: variable 'controller_profile' from source: play vars 11701 1727096143.78464: variable 'controller_profile' from source: play vars 11701 1727096143.78501: variable '__network_packages_default_wireless' from source: role '' defaults 11701 1727096143.78586: variable '__network_wireless_connections_defined' from source: role '' defaults 11701 1727096143.78881: variable 'network_connections' from source: task vars 11701 1727096143.78891: variable 'port2_profile' from source: play vars 11701 1727096143.78951: variable 'port2_profile' from source: play vars 11701 1727096143.78963: variable 'port1_profile' from source: play vars 11701 1727096143.79034: variable 'port1_profile' from source: play vars 11701 1727096143.79070: variable 'controller_profile' from source: play vars 11701 1727096143.79116: variable 'controller_profile' from source: play vars 11701 1727096143.79138: variable '__network_packages_default_team' from source: role '' defaults 11701 1727096143.79223: variable '__network_team_connections_defined' from source: role '' defaults 11701 1727096143.79555: variable 'network_connections' from source: task vars 11701 1727096143.79558: variable 'port2_profile' from source: play vars 11701 1727096143.79872: variable 'port2_profile' from source: play vars 11701 1727096143.79875: variable 'port1_profile' from source: play vars 11701 1727096143.79877: variable 'port1_profile' from source: play vars 11701 1727096143.79879: variable 'controller_profile' from source: play vars 11701 1727096143.79881: variable 'controller_profile' from source: play vars 11701 1727096143.79883: variable '__network_service_name_default_initscripts' from source: role '' defaults 11701 1727096143.80228: variable '__network_service_name_default_initscripts' from source: role '' defaults 11701 1727096143.80235: variable '__network_packages_default_initscripts' from source: role '' defaults 11701 1727096143.80300: variable '__network_packages_default_initscripts' from source: role '' defaults 11701 1727096143.80706: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11701 1727096143.81929: variable 'network_connections' from source: task vars 11701 1727096143.81935: variable 'port2_profile' from source: play vars 11701 1727096143.82070: variable 'port2_profile' from source: play vars 11701 1727096143.82230: variable 'port1_profile' from source: play vars 11701 1727096143.82234: variable 'port1_profile' from source: play vars 11701 1727096143.82236: variable 'controller_profile' from source: play vars 11701 1727096143.82312: variable 'controller_profile' from source: play vars 11701 1727096143.82320: variable 'ansible_distribution' from source: facts 11701 1727096143.82323: variable '__network_rh_distros' from source: role '' defaults 11701 1727096143.82331: variable 
'ansible_distribution_major_version' from source: facts 11701 1727096143.82347: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11701 1727096143.82737: variable 'ansible_distribution' from source: facts 11701 1727096143.82740: variable '__network_rh_distros' from source: role '' defaults 11701 1727096143.82745: variable 'ansible_distribution_major_version' from source: facts 11701 1727096143.82762: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11701 1727096143.83110: variable 'ansible_distribution' from source: facts 11701 1727096143.83278: variable '__network_rh_distros' from source: role '' defaults 11701 1727096143.83285: variable 'ansible_distribution_major_version' from source: facts 11701 1727096143.83323: variable 'network_provider' from source: set_fact 11701 1727096143.83346: variable 'omit' from source: magic vars 11701 1727096143.83383: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096143.83526: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096143.83542: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096143.83563: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096143.83606: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096143.83636: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096143.83639: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096143.83642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096143.83960: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096143.83963: Set connection var ansible_timeout to 10 11701 1727096143.83965: Set connection var ansible_shell_type to sh 11701 1727096143.83969: Set connection var ansible_shell_executable to /bin/sh 11701 1727096143.83971: Set connection var ansible_connection to ssh 11701 1727096143.83973: Set connection var ansible_pipelining to False 11701 1727096143.83983: variable 'ansible_shell_executable' from source: unknown 11701 1727096143.83985: variable 'ansible_connection' from source: unknown 11701 1727096143.83987: variable 'ansible_module_compression' from source: unknown 11701 1727096143.83992: variable 'ansible_shell_type' from source: unknown 11701 1727096143.83994: variable 'ansible_shell_executable' from source: unknown 11701 1727096143.83996: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096143.84001: variable 'ansible_pipelining' from source: unknown 11701 1727096143.84003: variable 'ansible_timeout' from source: unknown 11701 1727096143.84008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096143.84232: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096143.84242: variable 'omit' from source: magic vars 11701 1727096143.84247: starting attempt loop 11701 1727096143.84365: running the 
handler 11701 1727096143.84444: variable 'ansible_facts' from source: unknown 11701 1727096143.86062: _low_level_execute_command(): starting 11701 1727096143.86066: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096143.87791: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096143.87895: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096143.87903: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096143.87931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096143.89622: stdout chunk (state=3): >>>/root <<< 11701 1727096143.89741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096143.89745: stderr chunk (state=3): >>><<< 11701 1727096143.89748: stdout chunk (state=3): >>><<< 11701 1727096143.89988: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096143.90001: _low_level_execute_command(): starting 11701 1727096143.90007: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096143.899885-12998-148751432459376 `" && echo ansible-tmp-1727096143.899885-12998-148751432459376="` echo /root/.ansible/tmp/ansible-tmp-1727096143.899885-12998-148751432459376 `" ) && sleep 0' 
11701 1727096143.91235: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096143.91243: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096143.91253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096143.91289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096143.91301: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096143.91474: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096143.91584: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096143.91651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096143.93619: stdout chunk (state=3): >>>ansible-tmp-1727096143.899885-12998-148751432459376=/root/.ansible/tmp/ansible-tmp-1727096143.899885-12998-148751432459376 <<< 11701 1727096143.93771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096143.93838: stderr chunk (state=3): >>><<< 11701 1727096143.93841: stdout chunk (state=3): >>><<< 11701 1727096143.93905: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096143.899885-12998-148751432459376=/root/.ansible/tmp/ansible-tmp-1727096143.899885-12998-148751432459376 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096143.93940: variable 'ansible_module_compression' from source: unknown 11701 1727096143.93999: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-11701ub3f79xu/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 11701 1727096143.94066: variable 'ansible_facts' from source: unknown 11701 1727096143.94708: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096143.899885-12998-148751432459376/AnsiballZ_systemd.py 11701 1727096143.95025: Sending initial data 11701 1727096143.95029: Sent initial data (155 bytes) 11701 1727096143.96475: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096143.96542: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096143.96548: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096143.96553: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096143.96585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096143.98324: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096143.98370: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11701 1727096143.98472: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmp3r_woyyb /root/.ansible/tmp/ansible-tmp-1727096143.899885-12998-148751432459376/AnsiballZ_systemd.py <<< 11701 1727096143.98476: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096143.899885-12998-148751432459376/AnsiballZ_systemd.py" <<< 11701 1727096143.98547: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmp3r_woyyb" to remote "/root/.ansible/tmp/ansible-tmp-1727096143.899885-12998-148751432459376/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096143.899885-12998-148751432459376/AnsiballZ_systemd.py" <<< 11701 1727096144.01341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096144.01345: stdout chunk (state=3): >>><<< 11701 1727096144.01351: stderr chunk (state=3): >>><<< 11701 1727096144.01486: done transferring module to remote 11701 1727096144.01490: _low_level_execute_command(): starting 11701 1727096144.01493: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096143.899885-12998-148751432459376/ /root/.ansible/tmp/ansible-tmp-1727096143.899885-12998-148751432459376/AnsiballZ_systemd.py && sleep 0' 11701 1727096144.02814: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096144.02927: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096144.03392: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096144.03441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096144.05335: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096144.05345: stdout chunk (state=3): >>><<< 11701 1727096144.05595: stderr chunk (state=3): >>><<< 11701 1727096144.05599: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096144.05601: _low_level_execute_command(): starting 11701 1727096144.05604: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096143.899885-12998-148751432459376/AnsiballZ_systemd.py && sleep 0' 11701 1727096144.06681: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096144.06750: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096144.06849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096144.07289: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096144.07462: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096144.37102: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainStartTimestampMonotonic": "22098154", 
"ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainHandoffTimestampMonotonic": "22114950", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10412032", "MemoryPeak": "14127104", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3323367424", "EffectiveMemoryMax": "3702882304", "EffectiveMemoryHigh": "3702882304", "CPUUsageNSec": "697816000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 11701 1727096144.37122: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", 
"LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service NetworkManager-wait-online.service network.target multi-user.target", "After": 
"systemd-journald.socket basic.target system.slice sysinit.target dbus.socket dbus-broker.service cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:54:46 EDT", "StateChangeTimestampMonotonic": "229668491", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:17 EDT", "InactiveExitTimestampMonotonic": "22098658", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:18 EDT", "ActiveEnterTimestampMonotonic": "22578647", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ConditionTimestampMonotonic": "22097143", "AssertTimestamp": "Mon 2024-09-23 08:51:17 EDT", "AssertTimestampMonotonic": "22097145", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "795485e52798420593bcdae791c1fbbf", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11701 1727096144.39181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 11701 1727096144.39191: stdout chunk (state=3): >>><<< 11701 1727096144.39202: stderr chunk (state=3): >>><<< 11701 1727096144.39366: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainStartTimestampMonotonic": "22098154", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainHandoffTimestampMonotonic": "22114950", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10412032", "MemoryPeak": "14127104", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3323367424", "EffectiveMemoryMax": "3702882304", "EffectiveMemoryHigh": "3702882304", "CPUUsageNSec": "697816000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service NetworkManager-wait-online.service network.target multi-user.target", "After": "systemd-journald.socket basic.target system.slice sysinit.target dbus.socket dbus-broker.service cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:54:46 EDT", "StateChangeTimestampMonotonic": "229668491", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:17 EDT", "InactiveExitTimestampMonotonic": "22098658", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:18 EDT", "ActiveEnterTimestampMonotonic": "22578647", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ConditionTimestampMonotonic": "22097143", "AssertTimestamp": "Mon 2024-09-23 08:51:17 EDT", "AssertTimestampMonotonic": "22097145", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "795485e52798420593bcdae791c1fbbf", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 11701 1727096144.39564: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096143.899885-12998-148751432459376/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096144.39587: _low_level_execute_command(): starting 11701 1727096144.39590: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096143.899885-12998-148751432459376/ > /dev/null 2>&1 && sleep 0' 11701 1727096144.40164: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096144.40175: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096144.40186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096144.40199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096144.40212: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096144.40220: stderr chunk (state=3): >>>debug2: match not found <<< 11701 1727096144.40226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096144.40249: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11701 1727096144.40463: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096144.40798: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096144.40898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096144.42761: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096144.42765: stdout chunk (state=3): >>><<< 11701 1727096144.42773: stderr chunk (state=3): >>><<< 11701 1727096144.42789: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096144.42796: handler run complete 11701 1727096144.42860: attempt loop complete, returning result 11701 1727096144.42863: _execute() done 11701 1727096144.42866: dumping result to json 11701 1727096144.42889: done dumping result, returning 11701 1727096144.42897: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-a05c-c957-000000000088] 11701 1727096144.42901: sending task result for task 0afff68d-5257-a05c-c957-000000000088 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11701 1727096144.43398: no more pending results, returning what we have 11701 1727096144.43401: results queue empty 11701 1727096144.43402: checking for any_errors_fatal 11701 1727096144.43407: done checking for any_errors_fatal 11701 1727096144.43408: checking for max_fail_percentage 11701 1727096144.43409: done checking for max_fail_percentage 11701 1727096144.43410: checking to see if all hosts have failed and the running result is not ok 11701 1727096144.43411: done checking to see if all hosts have failed 11701 1727096144.43411: getting the remaining hosts for this loop 11701 1727096144.43413: done getting the remaining hosts for this loop 11701 1727096144.43416: getting the next task for host managed_node3 11701 1727096144.43422: done getting next task for host managed_node3 11701 1727096144.43426: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11701 1727096144.43431: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11701 1727096144.43442: getting variables 11701 1727096144.43444: in VariableManager get_vars() 11701 1727096144.43499: Calling all_inventory to load vars for managed_node3 11701 1727096144.43502: Calling groups_inventory to load vars for managed_node3 11701 1727096144.43512: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096144.43518: done sending task result for task 0afff68d-5257-a05c-c957-000000000088 11701 1727096144.43521: WORKER PROCESS EXITING 11701 1727096144.43530: Calling all_plugins_play to load vars for managed_node3 11701 1727096144.43533: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096144.43536: Calling groups_plugins_play to load vars for managed_node3 11701 1727096144.45945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096144.48141: done with get_vars() 11701 1727096144.48171: done getting variables 11701 1727096144.48228: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 08:55:44 -0400 (0:00:00.820) 0:00:28.447 ****** 11701 1727096144.48269: entering _queue_task() for managed_node3/service 11701 1727096144.48596: worker is 1 (out of 1 available) 11701 1727096144.48607: exiting _queue_task() for managed_node3/service 11701 1727096144.48618: done queuing things up, now waiting for results queue to drain 11701 1727096144.48619: waiting for pending results... 
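For reference, the module arguments and the censored result recorded above for the "Enable and start NetworkManager" task correspond roughly to a task of the shape sketched below. This is only a reconstruction from the logged module_args (name, state, enabled) and the no_log censoring; the task name matches the log, but the surrounding playbook structure is assumed rather than taken from the role source.

    - name: Enable and start NetworkManager
      # The log shows the action resolving to ansible.legacy.systemd with these module_args.
      ansible.builtin.systemd:
        name: NetworkManager
        state: started
        enabled: true
      # The result above appears as "censored" because no_log was in effect for this task.
      no_log: true
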
11701 1727096144.48925: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11701 1727096144.49195: in run() - task 0afff68d-5257-a05c-c957-000000000089 11701 1727096144.49198: variable 'ansible_search_path' from source: unknown 11701 1727096144.49201: variable 'ansible_search_path' from source: unknown 11701 1727096144.49203: calling self._execute() 11701 1727096144.49479: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096144.49483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096144.49485: variable 'omit' from source: magic vars 11701 1727096144.49766: variable 'ansible_distribution_major_version' from source: facts 11701 1727096144.49989: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096144.50213: variable 'network_provider' from source: set_fact 11701 1727096144.50217: Evaluated conditional (network_provider == "nm"): True 11701 1727096144.50673: variable '__network_wpa_supplicant_required' from source: role '' defaults 11701 1727096144.50676: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11701 1727096144.50839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11701 1727096144.53289: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11701 1727096144.53345: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11701 1727096144.53389: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11701 1727096144.53510: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11701 1727096144.53513: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11701 1727096144.53531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096144.53560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096144.53591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096144.53630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096144.53645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096144.53698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096144.53724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 11701 1727096144.53743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096144.53785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096144.53835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096144.53844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096144.53870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096144.53893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096144.53934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096144.53954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096144.54097: variable 'network_connections' from source: task vars 11701 1727096144.54110: variable 'port2_profile' from source: play vars 11701 1727096144.54270: variable 'port2_profile' from source: play vars 11701 1727096144.54274: variable 'port1_profile' from source: play vars 11701 1727096144.54276: variable 'port1_profile' from source: play vars 11701 1727096144.54278: variable 'controller_profile' from source: play vars 11701 1727096144.54308: variable 'controller_profile' from source: play vars 11701 1727096144.54379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11701 1727096144.54556: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11701 1727096144.54591: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11701 1727096144.54618: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11701 1727096144.54673: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11701 1727096144.54687: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11701 1727096144.54707: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11701 1727096144.54729: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096144.54754: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11701 1727096144.54830: variable '__network_wireless_connections_defined' from source: role '' defaults 11701 1727096144.55022: variable 'network_connections' from source: task vars 11701 1727096144.55025: variable 'port2_profile' from source: play vars 11701 1727096144.55090: variable 'port2_profile' from source: play vars 11701 1727096144.55159: variable 'port1_profile' from source: play vars 11701 1727096144.55162: variable 'port1_profile' from source: play vars 11701 1727096144.55175: variable 'controller_profile' from source: play vars 11701 1727096144.55227: variable 'controller_profile' from source: play vars 11701 1727096144.55264: Evaluated conditional (__network_wpa_supplicant_required): False 11701 1727096144.55270: when evaluation is False, skipping this task 11701 1727096144.55273: _execute() done 11701 1727096144.55276: dumping result to json 11701 1727096144.55278: done dumping result, returning 11701 1727096144.55280: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-a05c-c957-000000000089] 11701 1727096144.55282: sending task result for task 0afff68d-5257-a05c-c957-000000000089 11701 1727096144.55436: done sending task result for task 0afff68d-5257-a05c-c957-000000000089 11701 1727096144.55439: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 11701 1727096144.55492: no more pending results, returning what we have 11701 1727096144.55496: results queue empty 11701 1727096144.55497: checking for any_errors_fatal 11701 1727096144.55516: done checking for any_errors_fatal 11701 1727096144.55517: checking for max_fail_percentage 11701 1727096144.55519: done checking for max_fail_percentage 11701 1727096144.55520: checking to see if all hosts have failed and the running result is not ok 11701 1727096144.55526: done checking to see if all hosts have failed 11701 1727096144.55527: getting the remaining hosts for this loop 11701 1727096144.55529: done getting the remaining hosts for this loop 11701 1727096144.55533: getting the next task for host managed_node3 11701 1727096144.55540: done getting next task for host managed_node3 11701 1727096144.55544: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 11701 1727096144.55548: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False 11701 1727096144.55566: getting variables 11701 1727096144.55570: in VariableManager get_vars() 11701 1727096144.55613: Calling all_inventory to load vars for managed_node3 11701 1727096144.55616: Calling groups_inventory to load vars for managed_node3 11701 1727096144.55619: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096144.55630: Calling all_plugins_play to load vars for managed_node3 11701 1727096144.55634: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096144.55637: Calling groups_plugins_play to load vars for managed_node3 11701 1727096144.57257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096144.58761: done with get_vars() 11701 1727096144.58793: done getting variables 11701 1727096144.58856: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 08:55:44 -0400 (0:00:00.106) 0:00:28.553 ****** 11701 1727096144.58892: entering _queue_task() for managed_node3/service 11701 1727096144.59241: worker is 1 (out of 1 available) 11701 1727096144.59254: exiting _queue_task() for managed_node3/service 11701 1727096144.59264: done queuing things up, now waiting for results queue to drain 11701 1727096144.59265: waiting for pending results... 
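The wpa_supplicant task above was skipped because __network_wpa_supplicant_required evaluated to False, even though network_provider == "nm" was True. A conditionally guarded service task of that kind might look like the sketch below; only the task name and the two conditions come from the log, while the module body and the wpa_supplicant service name are assumptions added for illustration.

    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant        # assumed service name, not shown in the log
        state: started
        enabled: true
      when:
        - network_provider == "nm"
        - __network_wpa_supplicant_required
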
11701 1727096144.59587: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 11701 1727096144.59740: in run() - task 0afff68d-5257-a05c-c957-00000000008a 11701 1727096144.59744: variable 'ansible_search_path' from source: unknown 11701 1727096144.59746: variable 'ansible_search_path' from source: unknown 11701 1727096144.59765: calling self._execute() 11701 1727096144.59960: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096144.59964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096144.59970: variable 'omit' from source: magic vars 11701 1727096144.60237: variable 'ansible_distribution_major_version' from source: facts 11701 1727096144.60254: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096144.60359: variable 'network_provider' from source: set_fact 11701 1727096144.60372: Evaluated conditional (network_provider == "initscripts"): False 11701 1727096144.60375: when evaluation is False, skipping this task 11701 1727096144.60378: _execute() done 11701 1727096144.60380: dumping result to json 11701 1727096144.60390: done dumping result, returning 11701 1727096144.60394: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-a05c-c957-00000000008a] 11701 1727096144.60396: sending task result for task 0afff68d-5257-a05c-c957-00000000008a 11701 1727096144.60591: done sending task result for task 0afff68d-5257-a05c-c957-00000000008a 11701 1727096144.60595: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11701 1727096144.60637: no more pending results, returning what we have 11701 1727096144.60641: results queue empty 11701 1727096144.60642: checking for any_errors_fatal 11701 1727096144.60648: done checking for any_errors_fatal 11701 1727096144.60649: checking for max_fail_percentage 11701 1727096144.60651: done checking for max_fail_percentage 11701 1727096144.60651: checking to see if all hosts have failed and the running result is not ok 11701 1727096144.60652: done checking to see if all hosts have failed 11701 1727096144.60653: getting the remaining hosts for this loop 11701 1727096144.60654: done getting the remaining hosts for this loop 11701 1727096144.60657: getting the next task for host managed_node3 11701 1727096144.60663: done getting next task for host managed_node3 11701 1727096144.60667: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11701 1727096144.60672: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False 11701 1727096144.60690: getting variables 11701 1727096144.60692: in VariableManager get_vars() 11701 1727096144.60731: Calling all_inventory to load vars for managed_node3 11701 1727096144.60733: Calling groups_inventory to load vars for managed_node3 11701 1727096144.60736: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096144.60746: Calling all_plugins_play to load vars for managed_node3 11701 1727096144.60749: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096144.60752: Calling groups_plugins_play to load vars for managed_node3 11701 1727096144.62111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096144.63882: done with get_vars() 11701 1727096144.63924: done getting variables 11701 1727096144.64018: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 08:55:44 -0400 (0:00:00.051) 0:00:28.605 ****** 11701 1727096144.64059: entering _queue_task() for managed_node3/copy 11701 1727096144.64470: worker is 1 (out of 1 available) 11701 1727096144.64484: exiting _queue_task() for managed_node3/copy 11701 1727096144.64497: done queuing things up, now waiting for results queue to drain 11701 1727096144.64498: waiting for pending results... 
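The "Enable network service" task above was skipped because network_provider == "initscripts" evaluated to False on this host, and the entries that follow show the copy-based "Ensure initscripts network file dependency is present" task being skipped on the same condition. A minimal sketch of such provider-guarded tasks is given below; the task names, the copy action, and the guard come from the log, while the service name, destination path, file content, and mode are assumptions added purely for illustration.

    - name: Enable network service
      ansible.builtin.service:
        name: network               # assumed service name
        enabled: true
      when: network_provider == "initscripts"

    - name: Ensure initscripts network file dependency is present
      ansible.builtin.copy:
        dest: /etc/sysconfig/network                      # assumed destination
        content: "# File managed by the network role\n"   # assumed content
        mode: "0644"                                      # assumed mode
      when: network_provider == "initscripts"
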
11701 1727096144.64777: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11701 1727096144.64898: in run() - task 0afff68d-5257-a05c-c957-00000000008b 11701 1727096144.64913: variable 'ansible_search_path' from source: unknown 11701 1727096144.64917: variable 'ansible_search_path' from source: unknown 11701 1727096144.64972: calling self._execute() 11701 1727096144.65044: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096144.65048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096144.65074: variable 'omit' from source: magic vars 11701 1727096144.65422: variable 'ansible_distribution_major_version' from source: facts 11701 1727096144.65435: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096144.65544: variable 'network_provider' from source: set_fact 11701 1727096144.65548: Evaluated conditional (network_provider == "initscripts"): False 11701 1727096144.65551: when evaluation is False, skipping this task 11701 1727096144.65556: _execute() done 11701 1727096144.65559: dumping result to json 11701 1727096144.65562: done dumping result, returning 11701 1727096144.65566: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-a05c-c957-00000000008b] 11701 1727096144.65640: sending task result for task 0afff68d-5257-a05c-c957-00000000008b 11701 1727096144.65707: done sending task result for task 0afff68d-5257-a05c-c957-00000000008b 11701 1727096144.65710: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11701 1727096144.65765: no more pending results, returning what we have 11701 1727096144.65770: results queue empty 11701 1727096144.65771: checking for any_errors_fatal 11701 1727096144.65775: done checking for any_errors_fatal 11701 1727096144.65776: checking for max_fail_percentage 11701 1727096144.65777: done checking for max_fail_percentage 11701 1727096144.65778: checking to see if all hosts have failed and the running result is not ok 11701 1727096144.65779: done checking to see if all hosts have failed 11701 1727096144.65780: getting the remaining hosts for this loop 11701 1727096144.65781: done getting the remaining hosts for this loop 11701 1727096144.65784: getting the next task for host managed_node3 11701 1727096144.65791: done getting next task for host managed_node3 11701 1727096144.65794: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11701 1727096144.65798: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11701 1727096144.65814: getting variables 11701 1727096144.65815: in VariableManager get_vars() 11701 1727096144.65902: Calling all_inventory to load vars for managed_node3 11701 1727096144.65905: Calling groups_inventory to load vars for managed_node3 11701 1727096144.65909: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096144.65918: Calling all_plugins_play to load vars for managed_node3 11701 1727096144.65921: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096144.65925: Calling groups_plugins_play to load vars for managed_node3 11701 1727096144.67138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096144.68841: done with get_vars() 11701 1727096144.68888: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 08:55:44 -0400 (0:00:00.049) 0:00:28.654 ****** 11701 1727096144.68981: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 11701 1727096144.69494: worker is 1 (out of 1 available) 11701 1727096144.69505: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 11701 1727096144.69515: done queuing things up, now waiting for results queue to drain 11701 1727096144.69516: waiting for pending results... 11701 1727096144.69808: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11701 1727096144.69813: in run() - task 0afff68d-5257-a05c-c957-00000000008c 11701 1727096144.69816: variable 'ansible_search_path' from source: unknown 11701 1727096144.69818: variable 'ansible_search_path' from source: unknown 11701 1727096144.69821: calling self._execute() 11701 1727096144.70124: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096144.70129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096144.70133: variable 'omit' from source: magic vars 11701 1727096144.70332: variable 'ansible_distribution_major_version' from source: facts 11701 1727096144.70404: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096144.70411: variable 'omit' from source: magic vars 11701 1727096144.70489: variable 'omit' from source: magic vars 11701 1727096144.70663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11701 1727096144.72823: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11701 1727096144.72877: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11701 1727096144.72918: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11701 1727096144.72955: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11701 1727096144.72979: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11701 1727096144.73061: variable 'network_provider' from source: set_fact 11701 1727096144.73196: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11701 1727096144.73243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11701 1727096144.73264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11701 1727096144.73354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11701 1727096144.73358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11701 1727096144.73396: variable 'omit' from source: magic vars 11701 1727096144.73572: variable 'omit' from source: magic vars 11701 1727096144.73615: variable 'network_connections' from source: task vars 11701 1727096144.73626: variable 'port2_profile' from source: play vars 11701 1727096144.73689: variable 'port2_profile' from source: play vars 11701 1727096144.73697: variable 'port1_profile' from source: play vars 11701 1727096144.73756: variable 'port1_profile' from source: play vars 11701 1727096144.73760: variable 'controller_profile' from source: play vars 11701 1727096144.73822: variable 'controller_profile' from source: play vars 11701 1727096144.73974: variable 'omit' from source: magic vars 11701 1727096144.73985: variable '__lsr_ansible_managed' from source: task vars 11701 1727096144.74040: variable '__lsr_ansible_managed' from source: task vars 11701 1727096144.74217: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11701 1727096144.74727: Loaded config def from plugin (lookup/template) 11701 1727096144.74731: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 11701 1727096144.74763: File lookup term: get_ansible_managed.j2 11701 1727096144.74766: variable 'ansible_search_path' from source: unknown 11701 1727096144.74774: evaluation_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 11701 1727096144.74787: search_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 11701 1727096144.74801: variable 'ansible_search_path' from source: unknown 11701 1727096144.82226: variable 'ansible_managed' from source: unknown 11701 1727096144.82406: variable 'omit' from source: magic vars 11701 1727096144.82795: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096144.82820: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096144.82891: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096144.82895: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096144.82898: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096144.83023: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096144.83026: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096144.83029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096144.83394: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096144.83397: Set connection var ansible_timeout to 10 11701 1727096144.83400: Set connection var ansible_shell_type to sh 11701 1727096144.83402: Set connection var ansible_shell_executable to /bin/sh 11701 1727096144.83404: Set connection var ansible_connection to ssh 11701 1727096144.83406: Set connection var ansible_pipelining to False 11701 1727096144.83408: variable 'ansible_shell_executable' from source: unknown 11701 1727096144.83410: variable 'ansible_connection' from source: unknown 11701 1727096144.83412: variable 'ansible_module_compression' from source: unknown 11701 1727096144.83413: variable 'ansible_shell_type' from source: unknown 11701 1727096144.83415: variable 'ansible_shell_executable' from source: unknown 11701 1727096144.83417: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096144.83419: variable 'ansible_pipelining' from source: unknown 11701 1727096144.83421: variable 'ansible_timeout' from source: unknown 11701 1727096144.83432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096144.83435: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11701 1727096144.83437: variable 'omit' from source: magic vars 11701 1727096144.83439: starting attempt loop 11701 1727096144.83441: running the handler 11701 1727096144.83443: _low_level_execute_command(): starting 11701 1727096144.83445: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096144.84586: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096144.84632: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096144.84648: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096144.84675: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096144.84835: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096144.86512: stdout chunk (state=3): >>>/root <<< 11701 1727096144.86680: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096144.86684: stderr chunk (state=3): >>><<< 11701 1727096144.86686: stdout chunk (state=3): >>><<< 11701 1727096144.86739: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096144.86821: _low_level_execute_command(): starting 11701 1727096144.86825: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096144.867613-13047-77960869513107 `" && echo ansible-tmp-1727096144.867613-13047-77960869513107="` echo /root/.ansible/tmp/ansible-tmp-1727096144.867613-13047-77960869513107 `" ) && sleep 0' 11701 1727096144.87944: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096144.87955: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096144.88034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096144.88114: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096144.88119: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096144.88155: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096144.88162: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096144.90205: stdout chunk (state=3): >>>ansible-tmp-1727096144.867613-13047-77960869513107=/root/.ansible/tmp/ansible-tmp-1727096144.867613-13047-77960869513107 <<< 11701 1727096144.90352: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096144.90374: stderr chunk (state=3): >>><<< 11701 1727096144.90382: stdout chunk (state=3): >>><<< 11701 1727096144.90672: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096144.867613-13047-77960869513107=/root/.ansible/tmp/ansible-tmp-1727096144.867613-13047-77960869513107 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096144.90676: variable 'ansible_module_compression' from source: unknown 11701 1727096144.90679: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11701ub3f79xu/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 11701 1727096144.90790: variable 'ansible_facts' from source: unknown 11701 1727096144.90929: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096144.867613-13047-77960869513107/AnsiballZ_network_connections.py 11701 1727096144.91240: Sending initial data 11701 1727096144.91243: Sent initial data (166 bytes) 11701 1727096144.91787: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 
originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096144.91824: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096144.91837: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096144.91845: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096144.91916: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096144.93560: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11701 1727096144.93689: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096144.93839: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11701 1727096144.93843: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpwjrfw8ms /root/.ansible/tmp/ansible-tmp-1727096144.867613-13047-77960869513107/AnsiballZ_network_connections.py <<< 11701 1727096144.93845: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096144.867613-13047-77960869513107/AnsiballZ_network_connections.py" <<< 11701 1727096144.93884: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpwjrfw8ms" to remote "/root/.ansible/tmp/ansible-tmp-1727096144.867613-13047-77960869513107/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096144.867613-13047-77960869513107/AnsiballZ_network_connections.py" <<< 11701 1727096144.94875: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096144.94906: stderr chunk (state=3): >>><<< 11701 1727096144.94973: stdout chunk (state=3): >>><<< 11701 1727096144.94977: done transferring module to remote 11701 1727096144.94979: _low_level_execute_command(): starting 11701 1727096144.94981: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096144.867613-13047-77960869513107/ /root/.ansible/tmp/ansible-tmp-1727096144.867613-13047-77960869513107/AnsiballZ_network_connections.py && sleep 0' 11701 1727096144.95721: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096144.95784: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096144.95847: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096144.95864: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096144.95892: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096144.95961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096144.98076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096144.98080: stdout chunk (state=3): >>><<< 11701 1727096144.98082: stderr chunk (state=3): >>><<< 11701 1727096144.98113: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096144.98195: _low_level_execute_command(): starting 11701 1727096144.98198: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096144.867613-13047-77960869513107/AnsiballZ_network_connections.py && sleep 0' 11701 1727096144.98743: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096144.98758: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096144.98890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096144.98905: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096144.98922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096144.99007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096145.55024: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_qt1_xce5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_qt1_xce5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/cf85fda0-9a30-4802-92fc-47e5937048b2: error=unknown <<< 11701 1727096145.57078: stdout chunk (state=3): >>>Traceback (most recent call last): File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_qt1_xce5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_qt1_xce5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/1356eebb-22d1-4dd0-adba-d2a9505d1fb4: error=unknown <<< 11701 1727096145.58700: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_qt1_xce5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_qt1_xce5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/857d53ff-7175-4f1a-9313-51a779b02f5c: error=unknown <<< 11701 1727096145.58972: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 11701 1727096145.61099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 11701 1727096145.61306: stderr chunk (state=3): >>><<< 11701 1727096145.61310: stdout chunk (state=3): >>><<< 11701 1727096145.61313: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_qt1_xce5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_qt1_xce5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/cf85fda0-9a30-4802-92fc-47e5937048b2: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_qt1_xce5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_qt1_xce5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/1356eebb-22d1-4dd0-adba-d2a9505d1fb4: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_qt1_xce5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_qt1_xce5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/857d53ff-7175-4f1a-9313-51a779b02f5c: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 11701 1727096145.61315: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096144.867613-13047-77960869513107/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096145.61321: _low_level_execute_command(): starting 11701 1727096145.61323: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096144.867613-13047-77960869513107/ > /dev/null 2>&1 && sleep 0' 11701 1727096145.62335: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096145.62362: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096145.62376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096145.62390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096145.62477: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096145.62507: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096145.62583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096145.64675: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096145.64679: stderr chunk (state=3): >>><<< 11701 1727096145.64681: stdout chunk (state=3): >>><<< 11701 1727096145.64687: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096145.64694: handler run complete 11701 1727096145.64725: attempt loop complete, returning result 11701 1727096145.64728: _execute() done 11701 1727096145.64730: dumping result to json 11701 1727096145.64737: done dumping result, returning 11701 1727096145.64747: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-a05c-c957-00000000008c] 11701 1727096145.64749: sending task result for task 0afff68d-5257-a05c-c957-00000000008c 11701 1727096145.64890: done sending task result for task 0afff68d-5257-a05c-c957-00000000008c 11701 1727096145.64893: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 11701 1727096145.65103: no more pending results, returning what we have 11701 1727096145.65114: results queue empty 11701 1727096145.65115: checking for any_errors_fatal 11701 1727096145.65121: done checking for any_errors_fatal 11701 1727096145.65122: checking for max_fail_percentage 11701 1727096145.65123: done checking for max_fail_percentage 11701 1727096145.65124: checking to see if all hosts have failed and the running result is not ok 11701 1727096145.65125: done checking to see if all hosts have failed 
11701 1727096145.65126: getting the remaining hosts for this loop 11701 1727096145.65127: done getting the remaining hosts for this loop 11701 1727096145.65131: getting the next task for host managed_node3 11701 1727096145.65139: done getting next task for host managed_node3 11701 1727096145.65143: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 11701 1727096145.65147: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11701 1727096145.65159: getting variables 11701 1727096145.65161: in VariableManager get_vars() 11701 1727096145.65400: Calling all_inventory to load vars for managed_node3 11701 1727096145.65403: Calling groups_inventory to load vars for managed_node3 11701 1727096145.65405: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096145.65414: Calling all_plugins_play to load vars for managed_node3 11701 1727096145.65417: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096145.65419: Calling groups_plugins_play to load vars for managed_node3 11701 1727096145.66963: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096145.68513: done with get_vars() 11701 1727096145.68544: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 08:55:45 -0400 (0:00:00.996) 0:00:29.650 ****** 11701 1727096145.68640: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 11701 1727096145.69017: worker is 1 (out of 1 available) 11701 1727096145.69030: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 11701 1727096145.69043: done queuing things up, now waiting for results queue to drain 11701 1727096145.69044: waiting for pending results... 
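[editor's note] The module invocation recorded above (provider "nm", three bond profiles set to persistent_state "absent" and state "down") is what the network role generates when it is asked to tear those profiles down. As a minimal, hedged sketch only — the play below is reconstructed from the logged module_args, not copied from the actual test playbook, and the host pattern is an assumption — the driving variables would look roughly like this:

# Hypothetical play; connection names and states are taken from the
# logged module_args above, everything else is illustrative.
- hosts: all
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_provider: nm
        network_connections:
          - name: bond0.1
            persistent_state: absent
            state: down
          - name: bond0.0
            persistent_state: absent
            state: down
          - name: bond0
            persistent_state: absent
            state: down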
11701 1727096145.69346: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 11701 1727096145.69521: in run() - task 0afff68d-5257-a05c-c957-00000000008d 11701 1727096145.69673: variable 'ansible_search_path' from source: unknown 11701 1727096145.69677: variable 'ansible_search_path' from source: unknown 11701 1727096145.69680: calling self._execute() 11701 1727096145.69687: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096145.69697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096145.69709: variable 'omit' from source: magic vars 11701 1727096145.70096: variable 'ansible_distribution_major_version' from source: facts 11701 1727096145.70116: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096145.70341: variable 'network_state' from source: role '' defaults 11701 1727096145.70345: Evaluated conditional (network_state != {}): False 11701 1727096145.70348: when evaluation is False, skipping this task 11701 1727096145.70355: _execute() done 11701 1727096145.70357: dumping result to json 11701 1727096145.70359: done dumping result, returning 11701 1727096145.70362: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-a05c-c957-00000000008d] 11701 1727096145.70364: sending task result for task 0afff68d-5257-a05c-c957-00000000008d 11701 1727096145.70435: done sending task result for task 0afff68d-5257-a05c-c957-00000000008d 11701 1727096145.70439: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11701 1727096145.70501: no more pending results, returning what we have 11701 1727096145.70506: results queue empty 11701 1727096145.70507: checking for any_errors_fatal 11701 1727096145.70517: done checking for any_errors_fatal 11701 1727096145.70517: checking for max_fail_percentage 11701 1727096145.70519: done checking for max_fail_percentage 11701 1727096145.70520: checking to see if all hosts have failed and the running result is not ok 11701 1727096145.70521: done checking to see if all hosts have failed 11701 1727096145.70522: getting the remaining hosts for this loop 11701 1727096145.70523: done getting the remaining hosts for this loop 11701 1727096145.70526: getting the next task for host managed_node3 11701 1727096145.70533: done getting next task for host managed_node3 11701 1727096145.70537: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11701 1727096145.70541: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? 
False, did start at task? False 11701 1727096145.70563: getting variables 11701 1727096145.70565: in VariableManager get_vars() 11701 1727096145.70607: Calling all_inventory to load vars for managed_node3 11701 1727096145.70609: Calling groups_inventory to load vars for managed_node3 11701 1727096145.70612: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096145.70623: Calling all_plugins_play to load vars for managed_node3 11701 1727096145.70626: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096145.70629: Calling groups_plugins_play to load vars for managed_node3 11701 1727096145.72290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096145.73826: done with get_vars() 11701 1727096145.73863: done getting variables 11701 1727096145.73927: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 08:55:45 -0400 (0:00:00.053) 0:00:29.704 ****** 11701 1727096145.73970: entering _queue_task() for managed_node3/debug 11701 1727096145.74342: worker is 1 (out of 1 available) 11701 1727096145.74357: exiting _queue_task() for managed_node3/debug 11701 1727096145.74571: done queuing things up, now waiting for results queue to drain 11701 1727096145.74573: waiting for pending results... 
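[editor's note] The "Configure networking state" task above is skipped because the role default network_state is an empty dict, so the conditional network_state != {} evaluates to False. Purely as an illustration (the interface below is invented, not part of this run), supplying any non-empty network_state would make that conditional True and route configuration through the state-based path instead:

# Hypothetical, nmstate-style value; "eth1" is an invented interface
# name used only to show the shape of a non-empty network_state.
network_state:
  interfaces:
    - name: eth1
      type: ethernet
      state: up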
11701 1727096145.74786: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11701 1727096145.74848: in run() - task 0afff68d-5257-a05c-c957-00000000008e 11701 1727096145.74875: variable 'ansible_search_path' from source: unknown 11701 1727096145.74884: variable 'ansible_search_path' from source: unknown 11701 1727096145.74930: calling self._execute() 11701 1727096145.75035: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096145.75045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096145.75060: variable 'omit' from source: magic vars 11701 1727096145.75437: variable 'ansible_distribution_major_version' from source: facts 11701 1727096145.75462: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096145.75476: variable 'omit' from source: magic vars 11701 1727096145.75561: variable 'omit' from source: magic vars 11701 1727096145.75596: variable 'omit' from source: magic vars 11701 1727096145.75671: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096145.75687: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096145.75714: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096145.75737: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096145.75756: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096145.75872: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096145.75875: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096145.75879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096145.75921: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096145.75931: Set connection var ansible_timeout to 10 11701 1727096145.75938: Set connection var ansible_shell_type to sh 11701 1727096145.75947: Set connection var ansible_shell_executable to /bin/sh 11701 1727096145.75993: Set connection var ansible_connection to ssh 11701 1727096145.75996: Set connection var ansible_pipelining to False 11701 1727096145.76000: variable 'ansible_shell_executable' from source: unknown 11701 1727096145.76008: variable 'ansible_connection' from source: unknown 11701 1727096145.76015: variable 'ansible_module_compression' from source: unknown 11701 1727096145.76022: variable 'ansible_shell_type' from source: unknown 11701 1727096145.76028: variable 'ansible_shell_executable' from source: unknown 11701 1727096145.76034: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096145.76042: variable 'ansible_pipelining' from source: unknown 11701 1727096145.76049: variable 'ansible_timeout' from source: unknown 11701 1727096145.76102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096145.76218: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 
1727096145.76234: variable 'omit' from source: magic vars 11701 1727096145.76243: starting attempt loop 11701 1727096145.76248: running the handler 11701 1727096145.76385: variable '__network_connections_result' from source: set_fact 11701 1727096145.76455: handler run complete 11701 1727096145.76483: attempt loop complete, returning result 11701 1727096145.76538: _execute() done 11701 1727096145.76542: dumping result to json 11701 1727096145.76545: done dumping result, returning 11701 1727096145.76547: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-a05c-c957-00000000008e] 11701 1727096145.76550: sending task result for task 0afff68d-5257-a05c-c957-00000000008e ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 11701 1727096145.76721: no more pending results, returning what we have 11701 1727096145.76725: results queue empty 11701 1727096145.76727: checking for any_errors_fatal 11701 1727096145.76734: done checking for any_errors_fatal 11701 1727096145.76735: checking for max_fail_percentage 11701 1727096145.76737: done checking for max_fail_percentage 11701 1727096145.76738: checking to see if all hosts have failed and the running result is not ok 11701 1727096145.76739: done checking to see if all hosts have failed 11701 1727096145.76740: getting the remaining hosts for this loop 11701 1727096145.76744: done getting the remaining hosts for this loop 11701 1727096145.76748: getting the next task for host managed_node3 11701 1727096145.76758: done getting next task for host managed_node3 11701 1727096145.76764: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11701 1727096145.76771: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11701 1727096145.76784: getting variables 11701 1727096145.76787: in VariableManager get_vars() 11701 1727096145.76832: Calling all_inventory to load vars for managed_node3 11701 1727096145.76835: Calling groups_inventory to load vars for managed_node3 11701 1727096145.76837: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096145.76848: Calling all_plugins_play to load vars for managed_node3 11701 1727096145.76854: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096145.76858: Calling groups_plugins_play to load vars for managed_node3 11701 1727096145.77610: done sending task result for task 0afff68d-5257-a05c-c957-00000000008e 11701 1727096145.77614: WORKER PROCESS EXITING 11701 1727096145.78745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096145.80287: done with get_vars() 11701 1727096145.80319: done getting variables 11701 1727096145.80384: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 08:55:45 -0400 (0:00:00.064) 0:00:29.768 ****** 11701 1727096145.80422: entering _queue_task() for managed_node3/debug 11701 1727096145.80793: worker is 1 (out of 1 available) 11701 1727096145.80805: exiting _queue_task() for managed_node3/debug 11701 1727096145.80816: done queuing things up, now waiting for results queue to drain 11701 1727096145.80817: waiting for pending results... 
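[editor's note] The "Show stderr messages for the network_connections" task above is a plain debug action over the registered __network_connections_result fact, and it printed only the stderr_lines field. An approximate sketch of such a task (the real definition lives at roles/network/tasks/main.yml:177 inside the collection and may differ in detail):

# Approximation of the role's stderr-reporting step, not a verbatim copy.
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines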
11701 1727096145.81127: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11701 1727096145.81289: in run() - task 0afff68d-5257-a05c-c957-00000000008f 11701 1727096145.81314: variable 'ansible_search_path' from source: unknown 11701 1727096145.81321: variable 'ansible_search_path' from source: unknown 11701 1727096145.81369: calling self._execute() 11701 1727096145.81513: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096145.81517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096145.81519: variable 'omit' from source: magic vars 11701 1727096145.81900: variable 'ansible_distribution_major_version' from source: facts 11701 1727096145.81920: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096145.81933: variable 'omit' from source: magic vars 11701 1727096145.82007: variable 'omit' from source: magic vars 11701 1727096145.82083: variable 'omit' from source: magic vars 11701 1727096145.82130: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096145.82174: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096145.82198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096145.82221: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096145.82241: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096145.82288: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096145.82472: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096145.82475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096145.82478: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096145.82480: Set connection var ansible_timeout to 10 11701 1727096145.82482: Set connection var ansible_shell_type to sh 11701 1727096145.82484: Set connection var ansible_shell_executable to /bin/sh 11701 1727096145.82486: Set connection var ansible_connection to ssh 11701 1727096145.82488: Set connection var ansible_pipelining to False 11701 1727096145.82490: variable 'ansible_shell_executable' from source: unknown 11701 1727096145.82492: variable 'ansible_connection' from source: unknown 11701 1727096145.82494: variable 'ansible_module_compression' from source: unknown 11701 1727096145.82503: variable 'ansible_shell_type' from source: unknown 11701 1727096145.82510: variable 'ansible_shell_executable' from source: unknown 11701 1727096145.82517: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096145.82524: variable 'ansible_pipelining' from source: unknown 11701 1727096145.82530: variable 'ansible_timeout' from source: unknown 11701 1727096145.82537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096145.82701: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 
1727096145.82726: variable 'omit' from source: magic vars 11701 1727096145.82736: starting attempt loop 11701 1727096145.82743: running the handler 11701 1727096145.82805: variable '__network_connections_result' from source: set_fact 11701 1727096145.82907: variable '__network_connections_result' from source: set_fact 11701 1727096145.83049: handler run complete 11701 1727096145.83087: attempt loop complete, returning result 11701 1727096145.83095: _execute() done 11701 1727096145.83103: dumping result to json 11701 1727096145.83111: done dumping result, returning 11701 1727096145.83126: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-a05c-c957-00000000008f] 11701 1727096145.83135: sending task result for task 0afff68d-5257-a05c-c957-00000000008f 11701 1727096145.83339: done sending task result for task 0afff68d-5257-a05c-c957-00000000008f 11701 1727096145.83343: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 11701 1727096145.83469: no more pending results, returning what we have 11701 1727096145.83473: results queue empty 11701 1727096145.83475: checking for any_errors_fatal 11701 1727096145.83482: done checking for any_errors_fatal 11701 1727096145.83483: checking for max_fail_percentage 11701 1727096145.83485: done checking for max_fail_percentage 11701 1727096145.83486: checking to see if all hosts have failed and the running result is not ok 11701 1727096145.83487: done checking to see if all hosts have failed 11701 1727096145.83488: getting the remaining hosts for this loop 11701 1727096145.83490: done getting the remaining hosts for this loop 11701 1727096145.83493: getting the next task for host managed_node3 11701 1727096145.83501: done getting next task for host managed_node3 11701 1727096145.83505: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11701 1727096145.83509: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11701 1727096145.83521: getting variables 11701 1727096145.83523: in VariableManager get_vars() 11701 1727096145.83747: Calling all_inventory to load vars for managed_node3 11701 1727096145.83756: Calling groups_inventory to load vars for managed_node3 11701 1727096145.83759: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096145.83771: Calling all_plugins_play to load vars for managed_node3 11701 1727096145.83781: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096145.83784: Calling groups_plugins_play to load vars for managed_node3 11701 1727096145.85539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096145.87113: done with get_vars() 11701 1727096145.87145: done getting variables 11701 1727096145.87211: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 08:55:45 -0400 (0:00:00.068) 0:00:29.837 ****** 11701 1727096145.87250: entering _queue_task() for managed_node3/debug 11701 1727096145.87630: worker is 1 (out of 1 available) 11701 1727096145.87642: exiting _queue_task() for managed_node3/debug 11701 1727096145.87656: done queuing things up, now waiting for results queue to drain 11701 1727096145.87657: waiting for pending results... 11701 1727096145.88585: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11701 1727096145.88666: in run() - task 0afff68d-5257-a05c-c957-000000000090 11701 1727096145.88771: variable 'ansible_search_path' from source: unknown 11701 1727096145.88827: variable 'ansible_search_path' from source: unknown 11701 1727096145.88889: calling self._execute() 11701 1727096145.89106: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096145.89374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096145.89378: variable 'omit' from source: magic vars 11701 1727096145.90101: variable 'ansible_distribution_major_version' from source: facts 11701 1727096145.90180: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096145.90442: variable 'network_state' from source: role '' defaults 11701 1727096145.90474: Evaluated conditional (network_state != {}): False 11701 1727096145.90503: when evaluation is False, skipping this task 11701 1727096145.90512: _execute() done 11701 1727096145.90539: dumping result to json 11701 1727096145.90547: done dumping result, returning 11701 1727096145.90583: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-a05c-c957-000000000090] 11701 1727096145.90606: sending task result for task 0afff68d-5257-a05c-c957-000000000090 11701 1727096145.90974: done sending task result for task 0afff68d-5257-a05c-c957-000000000090 11701 1727096145.90977: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 11701 1727096145.91034: no more pending results, returning what we 
have 11701 1727096145.91039: results queue empty 11701 1727096145.91040: checking for any_errors_fatal 11701 1727096145.91051: done checking for any_errors_fatal 11701 1727096145.91054: checking for max_fail_percentage 11701 1727096145.91057: done checking for max_fail_percentage 11701 1727096145.91058: checking to see if all hosts have failed and the running result is not ok 11701 1727096145.91059: done checking to see if all hosts have failed 11701 1727096145.91060: getting the remaining hosts for this loop 11701 1727096145.91061: done getting the remaining hosts for this loop 11701 1727096145.91065: getting the next task for host managed_node3 11701 1727096145.91076: done getting next task for host managed_node3 11701 1727096145.91080: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 11701 1727096145.91085: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11701 1727096145.91106: getting variables 11701 1727096145.91108: in VariableManager get_vars() 11701 1727096145.91157: Calling all_inventory to load vars for managed_node3 11701 1727096145.91160: Calling groups_inventory to load vars for managed_node3 11701 1727096145.91163: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096145.91376: Calling all_plugins_play to load vars for managed_node3 11701 1727096145.91380: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096145.91384: Calling groups_plugins_play to load vars for managed_node3 11701 1727096145.99605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096146.01260: done with get_vars() 11701 1727096146.01290: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 08:55:46 -0400 (0:00:00.141) 0:00:29.978 ****** 11701 1727096146.01378: entering _queue_task() for managed_node3/ping 11701 1727096146.01740: worker is 1 (out of 1 available) 11701 1727096146.01757: exiting _queue_task() for managed_node3/ping 11701 1727096146.01770: done queuing things up, now waiting for results queue to drain 11701 1727096146.01772: waiting for pending results... 
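[editor's note] The "Re-test connectivity" task queued above runs the ping module on the managed node over the existing multiplexed SSH connection; as the later output shows, it returns {"ping": "pong"} when the remote Python interpreter is usable. A minimal standalone equivalent, for illustration only (the role's own task at roles/network/tasks/main.yml:192 may carry extra options):

# Minimal connectivity check; ansible.builtin.ping succeeds only if
# the target can execute a Python module, unlike a raw ICMP ping.
- hosts: all
  tasks:
    - name: Re-test connectivity
      ansible.builtin.ping: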
11701 1727096146.02031: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 11701 1727096146.02388: in run() - task 0afff68d-5257-a05c-c957-000000000091 11701 1727096146.02416: variable 'ansible_search_path' from source: unknown 11701 1727096146.02424: variable 'ansible_search_path' from source: unknown 11701 1727096146.02475: calling self._execute() 11701 1727096146.02583: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096146.02607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096146.02632: variable 'omit' from source: magic vars 11701 1727096146.03043: variable 'ansible_distribution_major_version' from source: facts 11701 1727096146.03070: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096146.03082: variable 'omit' from source: magic vars 11701 1727096146.03157: variable 'omit' from source: magic vars 11701 1727096146.03207: variable 'omit' from source: magic vars 11701 1727096146.03274: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096146.03373: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096146.03378: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096146.03381: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096146.03383: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096146.03422: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096146.03433: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096146.03501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096146.03578: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096146.03605: Set connection var ansible_timeout to 10 11701 1727096146.03773: Set connection var ansible_shell_type to sh 11701 1727096146.03776: Set connection var ansible_shell_executable to /bin/sh 11701 1727096146.03778: Set connection var ansible_connection to ssh 11701 1727096146.03780: Set connection var ansible_pipelining to False 11701 1727096146.03782: variable 'ansible_shell_executable' from source: unknown 11701 1727096146.03784: variable 'ansible_connection' from source: unknown 11701 1727096146.03786: variable 'ansible_module_compression' from source: unknown 11701 1727096146.03787: variable 'ansible_shell_type' from source: unknown 11701 1727096146.03790: variable 'ansible_shell_executable' from source: unknown 11701 1727096146.03792: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096146.03794: variable 'ansible_pipelining' from source: unknown 11701 1727096146.03796: variable 'ansible_timeout' from source: unknown 11701 1727096146.03798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096146.03985: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11701 1727096146.04001: variable 'omit' from source: magic vars 11701 
1727096146.04012: starting attempt loop 11701 1727096146.04024: running the handler 11701 1727096146.04437: _low_level_execute_command(): starting 11701 1727096146.04441: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096146.05294: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096146.05346: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096146.05371: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096146.05423: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096146.05460: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096146.07280: stdout chunk (state=3): >>>/root <<< 11701 1727096146.07371: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096146.07385: stdout chunk (state=3): >>><<< 11701 1727096146.07400: stderr chunk (state=3): >>><<< 11701 1727096146.07429: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096146.07450: _low_level_execute_command(): starting 11701 1727096146.07477: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096146.0743706-13117-209211459576116 `" && echo ansible-tmp-1727096146.0743706-13117-209211459576116="` echo 
/root/.ansible/tmp/ansible-tmp-1727096146.0743706-13117-209211459576116 `" ) && sleep 0' 11701 1727096146.08051: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096146.08072: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096146.08088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096146.08107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096146.08123: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096146.08136: stderr chunk (state=3): >>>debug2: match not found <<< 11701 1727096146.08149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096146.08184: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11701 1727096146.08199: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 11701 1727096146.08211: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11701 1727096146.08286: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096146.08317: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096146.08335: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096146.08360: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096146.08567: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096146.10547: stdout chunk (state=3): >>>ansible-tmp-1727096146.0743706-13117-209211459576116=/root/.ansible/tmp/ansible-tmp-1727096146.0743706-13117-209211459576116 <<< 11701 1727096146.10689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096146.10693: stdout chunk (state=3): >>><<< 11701 1727096146.10700: stderr chunk (state=3): >>><<< 11701 1727096146.10719: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096146.0743706-13117-209211459576116=/root/.ansible/tmp/ansible-tmp-1727096146.0743706-13117-209211459576116 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096146.10766: variable 'ansible_module_compression' from source: unknown 11701 1727096146.10807: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11701ub3f79xu/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 11701 1727096146.10846: variable 'ansible_facts' from source: unknown 11701 1727096146.10930: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096146.0743706-13117-209211459576116/AnsiballZ_ping.py 11701 1727096146.11115: Sending initial data 11701 1727096146.11119: Sent initial data (153 bytes) 11701 1727096146.11735: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096146.11739: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096146.11759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096146.11872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096146.11891: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096146.11951: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096146.13593: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096146.13635: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11701 1727096146.13696: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpa1u9z76b /root/.ansible/tmp/ansible-tmp-1727096146.0743706-13117-209211459576116/AnsiballZ_ping.py <<< 11701 1727096146.13700: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096146.0743706-13117-209211459576116/AnsiballZ_ping.py" <<< 11701 1727096146.13731: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpa1u9z76b" to remote "/root/.ansible/tmp/ansible-tmp-1727096146.0743706-13117-209211459576116/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096146.0743706-13117-209211459576116/AnsiballZ_ping.py" <<< 11701 1727096146.14415: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096146.14523: stderr chunk (state=3): >>><<< 11701 1727096146.14527: stdout chunk (state=3): >>><<< 11701 1727096146.14533: done transferring module to remote 11701 1727096146.14550: _low_level_execute_command(): starting 11701 1727096146.14560: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096146.0743706-13117-209211459576116/ /root/.ansible/tmp/ansible-tmp-1727096146.0743706-13117-209211459576116/AnsiballZ_ping.py && sleep 0' 11701 1727096146.15273: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096146.15294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096146.15329: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096146.15343: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096146.15361: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096146.15425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096146.17333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096146.17337: stdout chunk (state=3): >>><<< 11701 1727096146.17339: stderr chunk (state=3): >>><<< 11701 1727096146.17495: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096146.17498: _low_level_execute_command(): starting 11701 1727096146.17501: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096146.0743706-13117-209211459576116/AnsiballZ_ping.py && sleep 0' 11701 1727096146.18088: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096146.18097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096146.18108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096146.18130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096146.18148: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096146.18158: stderr chunk (state=3): >>>debug2: match not found <<< 11701 1727096146.18171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096146.18186: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11701 1727096146.18194: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 11701 1727096146.18250: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096146.18293: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096146.18306: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096146.18325: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096146.18413: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096146.33806: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 11701 1727096146.35281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 11701 1727096146.35285: stdout chunk (state=3): >>><<< 11701 1727096146.35287: stderr chunk (state=3): >>><<< 11701 1727096146.35306: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 11701 1727096146.35370: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096146.0743706-13117-209211459576116/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096146.35374: _low_level_execute_command(): starting 11701 1727096146.35377: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096146.0743706-13117-209211459576116/ > /dev/null 2>&1 && sleep 0' 11701 1727096146.36277: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096146.36280: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096146.36282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096146.36284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096146.36285: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096146.36287: stderr chunk (state=3): >>>debug2: match not found <<< 11701 1727096146.36289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096146.36290: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11701 1727096146.36292: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 11701 1727096146.36294: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 
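The {"ping": "pong"} payload captured just above is the entire module result for the connectivity re-test: the ping module accepts an optional data argument (defaulting to "pong") and simply echoes it back, so a successful round trip proves that the controller can build, transfer, and execute an AnsiballZ module on the managed host. A minimal task that produces exactly this output might look like the sketch below; the role's actual task file is not reproduced in this log, so treat the YAML as illustrative only:

    - name: Re-test connectivity
      ansible.builtin.ping:

Because ping performs no changes on the target, it always reports changed: false, which matches the ok result recorded for this task further down.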
11701 1727096146.36296: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096146.36297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096146.36304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096146.36305: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096146.36307: stderr chunk (state=3): >>>debug2: match found <<< 11701 1727096146.36309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096146.36310: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096146.36312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096146.36369: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096146.36407: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096146.38470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096146.38474: stdout chunk (state=3): >>><<< 11701 1727096146.38482: stderr chunk (state=3): >>><<< 11701 1727096146.38504: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096146.38516: handler run complete 11701 1727096146.38597: attempt loop complete, returning result 11701 1727096146.38601: _execute() done 11701 1727096146.38603: dumping result to json 11701 1727096146.38605: done dumping result, returning 11701 1727096146.38608: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-a05c-c957-000000000091] 11701 1727096146.38610: sending task result for task 0afff68d-5257-a05c-c957-000000000091 ok: [managed_node3] => { "changed": false, "ping": "pong" } 11701 1727096146.38768: no more pending results, returning what we have 11701 1727096146.38773: results queue empty 11701 1727096146.38774: checking for any_errors_fatal 11701 1727096146.38782: done checking for any_errors_fatal 11701 1727096146.38783: checking for max_fail_percentage 11701 1727096146.38785: done checking for max_fail_percentage 11701 1727096146.38785: checking to see if 
all hosts have failed and the running result is not ok 11701 1727096146.38787: done checking to see if all hosts have failed 11701 1727096146.38787: getting the remaining hosts for this loop 11701 1727096146.38789: done getting the remaining hosts for this loop 11701 1727096146.38792: getting the next task for host managed_node3 11701 1727096146.38803: done getting next task for host managed_node3 11701 1727096146.38812: ^ task is: TASK: meta (role_complete) 11701 1727096146.38816: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11701 1727096146.38831: getting variables 11701 1727096146.38833: in VariableManager get_vars() 11701 1727096146.38939: Calling all_inventory to load vars for managed_node3 11701 1727096146.38942: Calling groups_inventory to load vars for managed_node3 11701 1727096146.38945: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096146.38960: Calling all_plugins_play to load vars for managed_node3 11701 1727096146.38964: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096146.39031: Calling groups_plugins_play to load vars for managed_node3 11701 1727096146.39783: done sending task result for task 0afff68d-5257-a05c-c957-000000000091 11701 1727096146.39787: WORKER PROCESS EXITING 11701 1727096146.40655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096146.42946: done with get_vars() 11701 1727096146.43189: done getting variables 11701 1727096146.43379: done queuing things up, now waiting for results queue to drain 11701 1727096146.43382: results queue empty 11701 1727096146.43382: checking for any_errors_fatal 11701 1727096146.43385: done checking for any_errors_fatal 11701 1727096146.43386: checking for max_fail_percentage 11701 1727096146.43391: done checking for max_fail_percentage 11701 1727096146.43392: checking to see if all hosts have failed and the running result is not ok 11701 1727096146.43393: done checking to see if all hosts have failed 11701 1727096146.43393: getting the remaining hosts for this loop 11701 1727096146.43394: done getting the remaining hosts for this loop 11701 1727096146.43397: getting the next task for host managed_node3 11701 1727096146.43402: done getting next task for host managed_node3 11701 1727096146.43404: ^ task is: TASK: Delete the device '{{ controller_device }}' 11701 1727096146.43407: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11701 1727096146.43410: getting variables 11701 1727096146.43411: in VariableManager get_vars() 11701 1727096146.43443: Calling all_inventory to load vars for managed_node3 11701 1727096146.43447: Calling groups_inventory to load vars for managed_node3 11701 1727096146.43449: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096146.43457: Calling all_plugins_play to load vars for managed_node3 11701 1727096146.43460: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096146.43463: Calling groups_plugins_play to load vars for managed_node3 11701 1727096146.44601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096146.47097: done with get_vars() 11701 1727096146.47121: done getting variables 11701 1727096146.47170: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11701 1727096146.47304: variable 'controller_device' from source: play vars TASK [Delete the device 'nm-bond'] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:114 Monday 23 September 2024 08:55:46 -0400 (0:00:00.459) 0:00:30.437 ****** 11701 1727096146.47335: entering _queue_task() for managed_node3/command 11701 1727096146.47696: worker is 1 (out of 1 available) 11701 1727096146.47710: exiting _queue_task() for managed_node3/command 11701 1727096146.47721: done queuing things up, now waiting for results queue to drain 11701 1727096146.47722: waiting for pending results... 
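The task name "Delete the device '{{ controller_device }}'" is templated from the controller_device play variable (here nm-bond), and the execution trace below shows it running ip link del nm-bond through the command module while a failed_when override keeps a missing device from aborting the test. A plausible reconstruction of the task, assuming the playbook follows that pattern (the source YAML itself is not shown in this log), is:

    - name: Delete the device '{{ controller_device }}'
      command: "ip link del {{ controller_device }}"
      failed_when: false

With failed_when: false the command's non-zero return code is still recorded in the result, but it never marks the host as failed, which is exactly what the result for this task shows (rc: 1, failed_when_result: false).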
11701 1727096146.48011: running TaskExecutor() for managed_node3/TASK: Delete the device 'nm-bond' 11701 1727096146.48140: in run() - task 0afff68d-5257-a05c-c957-0000000000c1 11701 1727096146.48166: variable 'ansible_search_path' from source: unknown 11701 1727096146.48215: calling self._execute() 11701 1727096146.48331: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096146.48343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096146.48360: variable 'omit' from source: magic vars 11701 1727096146.48736: variable 'ansible_distribution_major_version' from source: facts 11701 1727096146.48756: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096146.48769: variable 'omit' from source: magic vars 11701 1727096146.48793: variable 'omit' from source: magic vars 11701 1727096146.48900: variable 'controller_device' from source: play vars 11701 1727096146.48924: variable 'omit' from source: magic vars 11701 1727096146.48978: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096146.49017: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096146.49043: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096146.49072: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096146.49090: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096146.49124: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096146.49173: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096146.49176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096146.49261: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096146.49277: Set connection var ansible_timeout to 10 11701 1727096146.49287: Set connection var ansible_shell_type to sh 11701 1727096146.49297: Set connection var ansible_shell_executable to /bin/sh 11701 1727096146.49304: Set connection var ansible_connection to ssh 11701 1727096146.49372: Set connection var ansible_pipelining to False 11701 1727096146.49375: variable 'ansible_shell_executable' from source: unknown 11701 1727096146.49377: variable 'ansible_connection' from source: unknown 11701 1727096146.49380: variable 'ansible_module_compression' from source: unknown 11701 1727096146.49382: variable 'ansible_shell_type' from source: unknown 11701 1727096146.49384: variable 'ansible_shell_executable' from source: unknown 11701 1727096146.49388: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096146.49392: variable 'ansible_pipelining' from source: unknown 11701 1727096146.49394: variable 'ansible_timeout' from source: unknown 11701 1727096146.49396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096146.49533: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096146.49555: variable 'omit' from source: magic vars 11701 
1727096146.49566: starting attempt loop 11701 1727096146.49576: running the handler 11701 1727096146.49595: _low_level_execute_command(): starting 11701 1727096146.49720: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096146.50361: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096146.50483: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096146.50510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096146.50581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096146.52273: stdout chunk (state=3): >>>/root <<< 11701 1727096146.52392: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096146.52430: stderr chunk (state=3): >>><<< 11701 1727096146.52456: stdout chunk (state=3): >>><<< 11701 1727096146.52487: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096146.52513: _low_level_execute_command(): starting 11701 1727096146.52525: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096146.5249977-13150-132529309237527 `" && echo ansible-tmp-1727096146.5249977-13150-132529309237527="` echo /root/.ansible/tmp/ansible-tmp-1727096146.5249977-13150-132529309237527 `" ) && 
sleep 0' 11701 1727096146.53165: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096146.53190: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096146.53212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096146.53324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096146.53353: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096146.53434: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096146.55402: stdout chunk (state=3): >>>ansible-tmp-1727096146.5249977-13150-132529309237527=/root/.ansible/tmp/ansible-tmp-1727096146.5249977-13150-132529309237527 <<< 11701 1727096146.55576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096146.55580: stdout chunk (state=3): >>><<< 11701 1727096146.55583: stderr chunk (state=3): >>><<< 11701 1727096146.55686: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096146.5249977-13150-132529309237527=/root/.ansible/tmp/ansible-tmp-1727096146.5249977-13150-132529309237527 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096146.55690: variable 'ansible_module_compression' from source: unknown 11701 1727096146.55701: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11701ub3f79xu/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11701 1727096146.55745: variable 'ansible_facts' from source: unknown 11701 
1727096146.55841: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096146.5249977-13150-132529309237527/AnsiballZ_command.py 11701 1727096146.56034: Sending initial data 11701 1727096146.56037: Sent initial data (156 bytes) 11701 1727096146.56674: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096146.56707: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096146.56728: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096146.56741: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096146.56799: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096146.58508: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096146.58551: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11701 1727096146.58616: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpv1ckwr1l /root/.ansible/tmp/ansible-tmp-1727096146.5249977-13150-132529309237527/AnsiballZ_command.py <<< 11701 1727096146.58626: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096146.5249977-13150-132529309237527/AnsiballZ_command.py" <<< 11701 1727096146.58661: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpv1ckwr1l" to remote "/root/.ansible/tmp/ansible-tmp-1727096146.5249977-13150-132529309237527/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096146.5249977-13150-132529309237527/AnsiballZ_command.py" <<< 11701 1727096146.59486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096146.59523: stdout chunk (state=3): >>><<< 11701 1727096146.59527: stderr chunk (state=3): >>><<< 11701 1727096146.59547: done transferring module to remote 11701 1727096146.59632: _low_level_execute_command(): starting 11701 1727096146.59637: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096146.5249977-13150-132529309237527/ /root/.ansible/tmp/ansible-tmp-1727096146.5249977-13150-132529309237527/AnsiballZ_command.py && sleep 0' 11701 1727096146.60261: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096146.60294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096146.60314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096146.60407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096146.60443: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096146.60465: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096146.60489: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096146.60559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096146.62386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096146.62465: stderr chunk (state=3): >>><<< 11701 1727096146.62471: stdout chunk (state=3): >>><<< 11701 1727096146.62473: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096146.62479: _low_level_execute_command(): starting 11701 1727096146.62482: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096146.5249977-13150-132529309237527/AnsiballZ_command.py && sleep 0' 11701 1727096146.63223: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096146.63243: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096146.63265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096146.63289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096146.63308: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096146.63355: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096146.63421: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096146.63442: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096146.63474: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096146.63560: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096146.79829: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-23 08:55:46.787500", "end": "2024-09-23 08:55:46.795326", "delta": "0:00:00.007826", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11701 1727096146.81461: stderr chunk (state=3): >>>debug2: Received exit status from master 1 
Shared connection to 10.31.14.152 closed. <<< 11701 1727096146.81465: stdout chunk (state=3): >>><<< 11701 1727096146.81470: stderr chunk (state=3): >>><<< 11701 1727096146.81472: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-23 08:55:46.787500", "end": "2024-09-23 08:55:46.795326", "delta": "0:00:00.007826", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.14.152 closed. 
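Note that the raw AnsiballZ result above carries "failed": true and "msg": "non-zero return code"; it is only the play-level failed_when override that downgrades this to an ok result a few entries later. If one wanted to tolerate only the "device does not exist" case while still catching genuine failures, a stricter variant would register the result and test both the return code and stderr. This is not what the playbook under test does; it is purely an alternative sketch, and the registered variable name is hypothetical:

    - name: Delete the device '{{ controller_device }}'
      command: "ip link del {{ controller_device }}"
      register: device_del    # hypothetical variable name
      failed_when:
        - device_del.rc != 0
        - "'Cannot find device' not in device_del.stderr"

Listing several conditions under failed_when combines them with a logical AND, so the task only fails when the command returns non-zero for some reason other than the device already being gone.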
11701 1727096146.81491: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096146.5249977-13150-132529309237527/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096146.81509: _low_level_execute_command(): starting 11701 1727096146.81512: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096146.5249977-13150-132529309237527/ > /dev/null 2>&1 && sleep 0' 11701 1727096146.82277: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096146.82280: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096146.82283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096146.82286: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096146.82340: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096146.82362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096146.82383: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096146.82487: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096146.84400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096146.84405: stdout chunk (state=3): >>><<< 11701 1727096146.84501: stderr chunk (state=3): >>><<< 11701 1727096146.84505: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096146.84512: handler run complete 11701 1727096146.84514: Evaluated conditional (False): False 11701 1727096146.84517: Evaluated conditional (False): False 11701 1727096146.84519: attempt loop complete, returning result 11701 1727096146.84520: _execute() done 11701 1727096146.84522: dumping result to json 11701 1727096146.84524: done dumping result, returning 11701 1727096146.84530: done running TaskExecutor() for managed_node3/TASK: Delete the device 'nm-bond' [0afff68d-5257-a05c-c957-0000000000c1] 11701 1727096146.84532: sending task result for task 0afff68d-5257-a05c-c957-0000000000c1 11701 1727096146.84675: done sending task result for task 0afff68d-5257-a05c-c957-0000000000c1 11701 1727096146.84679: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "link", "del", "nm-bond" ], "delta": "0:00:00.007826", "end": "2024-09-23 08:55:46.795326", "failed_when_result": false, "rc": 1, "start": "2024-09-23 08:55:46.787500" } STDERR: Cannot find device "nm-bond" MSG: non-zero return code 11701 1727096146.84749: no more pending results, returning what we have 11701 1727096146.84753: results queue empty 11701 1727096146.84755: checking for any_errors_fatal 11701 1727096146.84756: done checking for any_errors_fatal 11701 1727096146.84757: checking for max_fail_percentage 11701 1727096146.84759: done checking for max_fail_percentage 11701 1727096146.84760: checking to see if all hosts have failed and the running result is not ok 11701 1727096146.84761: done checking to see if all hosts have failed 11701 1727096146.84762: getting the remaining hosts for this loop 11701 1727096146.84764: done getting the remaining hosts for this loop 11701 1727096146.84870: getting the next task for host managed_node3 11701 1727096146.84879: done getting next task for host managed_node3 11701 1727096146.84881: ^ task is: TASK: Remove test interfaces 11701 1727096146.84884: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11701 1727096146.84888: getting variables 11701 1727096146.84889: in VariableManager get_vars() 11701 1727096146.84924: Calling all_inventory to load vars for managed_node3 11701 1727096146.84927: Calling groups_inventory to load vars for managed_node3 11701 1727096146.84929: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096146.84939: Calling all_plugins_play to load vars for managed_node3 11701 1727096146.84941: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096146.84944: Calling groups_plugins_play to load vars for managed_node3 11701 1727096146.86549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096146.88054: done with get_vars() 11701 1727096146.88084: done getting variables 11701 1727096146.88140: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Monday 23 September 2024 08:55:46 -0400 (0:00:00.408) 0:00:30.846 ****** 11701 1727096146.88176: entering _queue_task() for managed_node3/shell 11701 1727096146.88539: worker is 1 (out of 1 available) 11701 1727096146.88553: exiting _queue_task() for managed_node3/shell 11701 1727096146.88564: done queuing things up, now waiting for results queue to drain 11701 1727096146.88565: waiting for pending results... 
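The upcoming "Remove test interfaces" task (remove_test_interfaces_with_dhcp.yml:3) resolves the dhcp_interface1 and dhcp_interface2 play variables and runs through the shell action plugin rather than command, which allows several clean-up commands to be chained in a single invocation. The exact script lives in the test helper file and is not reproduced in this log; a minimal sketch of the shape such a task takes, with assumed interface clean-up commands, is:

    - name: Remove test interfaces
      shell: |
        ip link delete {{ dhcp_interface1 }} || true
        ip link delete {{ dhcp_interface2 }} || true

In this sketch the "|| true" suffix keeps the clean-up tolerant when an interface has already been removed, mirroring the lenient error handling used for the bond device above.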
11701 1727096146.88882: running TaskExecutor() for managed_node3/TASK: Remove test interfaces 11701 1727096146.89005: in run() - task 0afff68d-5257-a05c-c957-0000000000c5 11701 1727096146.89020: variable 'ansible_search_path' from source: unknown 11701 1727096146.89024: variable 'ansible_search_path' from source: unknown 11701 1727096146.89061: calling self._execute() 11701 1727096146.89203: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096146.89206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096146.89210: variable 'omit' from source: magic vars 11701 1727096146.89673: variable 'ansible_distribution_major_version' from source: facts 11701 1727096146.89677: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096146.89680: variable 'omit' from source: magic vars 11701 1727096146.89682: variable 'omit' from source: magic vars 11701 1727096146.89793: variable 'dhcp_interface1' from source: play vars 11701 1727096146.89798: variable 'dhcp_interface2' from source: play vars 11701 1727096146.89825: variable 'omit' from source: magic vars 11701 1727096146.89992: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096146.89996: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096146.89999: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096146.90002: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096146.90004: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096146.90006: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096146.90009: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096146.90011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096146.90089: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096146.90092: Set connection var ansible_timeout to 10 11701 1727096146.90096: Set connection var ansible_shell_type to sh 11701 1727096146.90099: Set connection var ansible_shell_executable to /bin/sh 11701 1727096146.90101: Set connection var ansible_connection to ssh 11701 1727096146.90106: Set connection var ansible_pipelining to False 11701 1727096146.90128: variable 'ansible_shell_executable' from source: unknown 11701 1727096146.90138: variable 'ansible_connection' from source: unknown 11701 1727096146.90141: variable 'ansible_module_compression' from source: unknown 11701 1727096146.90143: variable 'ansible_shell_type' from source: unknown 11701 1727096146.90146: variable 'ansible_shell_executable' from source: unknown 11701 1727096146.90148: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096146.90150: variable 'ansible_pipelining' from source: unknown 11701 1727096146.90155: variable 'ansible_timeout' from source: unknown 11701 1727096146.90158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096146.90291: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096146.90302: variable 'omit' from source: magic vars 11701 1727096146.90308: starting attempt loop 11701 1727096146.90312: running the handler 11701 1727096146.90486: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096146.90491: _low_level_execute_command(): starting 11701 1727096146.90493: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096146.91056: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096146.91083: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096146.91086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096146.91128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096146.91214: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096146.91247: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096146.91289: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096146.92991: stdout chunk (state=3): >>>/root <<< 11701 1727096146.93144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096146.93148: stdout chunk (state=3): >>><<< 11701 1727096146.93150: stderr chunk (state=3): >>><<< 11701 1727096146.93171: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096146.93273: _low_level_execute_command(): starting 11701 1727096146.93278: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096146.9318306-13174-128406042279991 `" && echo ansible-tmp-1727096146.9318306-13174-128406042279991="` echo /root/.ansible/tmp/ansible-tmp-1727096146.9318306-13174-128406042279991 `" ) && sleep 0' 11701 1727096146.93881: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096146.93940: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096146.93948: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096146.93996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096146.95981: stdout chunk (state=3): >>>ansible-tmp-1727096146.9318306-13174-128406042279991=/root/.ansible/tmp/ansible-tmp-1727096146.9318306-13174-128406042279991 <<< 11701 1727096146.96082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096146.96161: stderr chunk (state=3): >>><<< 11701 1727096146.96173: stdout chunk (state=3): >>><<< 11701 1727096146.96374: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096146.9318306-13174-128406042279991=/root/.ansible/tmp/ansible-tmp-1727096146.9318306-13174-128406042279991 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096146.96378: variable 'ansible_module_compression' from source: unknown 11701 1727096146.96380: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11701ub3f79xu/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11701 1727096146.96383: variable 'ansible_facts' from source: unknown 11701 1727096146.96443: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096146.9318306-13174-128406042279991/AnsiballZ_command.py 11701 1727096146.96629: Sending initial data 11701 1727096146.96639: Sent initial data (156 bytes) 11701 1727096146.97268: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096146.97359: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096146.97375: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096146.97388: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096146.97450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096146.99208: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11701 1727096146.99212: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 11701 1727096146.99215: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096146.99217: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11701 1727096146.99259: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpcm4p6_5l /root/.ansible/tmp/ansible-tmp-1727096146.9318306-13174-128406042279991/AnsiballZ_command.py <<< 11701 1727096146.99262: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096146.9318306-13174-128406042279991/AnsiballZ_command.py" <<< 11701 1727096146.99299: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpcm4p6_5l" to remote "/root/.ansible/tmp/ansible-tmp-1727096146.9318306-13174-128406042279991/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096146.9318306-13174-128406042279991/AnsiballZ_command.py" <<< 11701 1727096147.00038: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096147.00086: stderr chunk (state=3): >>><<< 11701 1727096147.00130: stdout chunk (state=3): >>><<< 11701 1727096147.00134: done transferring module to remote 11701 1727096147.00150: _low_level_execute_command(): starting 11701 1727096147.00161: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096146.9318306-13174-128406042279991/ /root/.ansible/tmp/ansible-tmp-1727096146.9318306-13174-128406042279991/AnsiballZ_command.py && sleep 0' 11701 1727096147.00954: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096147.01029: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11701 1727096147.01214: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 11701 1727096147.01217: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096147.01314: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096147.01412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096147.03573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096147.03577: stdout chunk (state=3): >>><<< 11701 1727096147.03584: stderr chunk (state=3): >>><<< 11701 1727096147.03684: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096147.03688: _low_level_execute_command(): starting 11701 1727096147.03692: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096146.9318306-13174-128406042279991/AnsiballZ_command.py && sleep 0' 11701 1727096147.04676: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096147.04680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 11701 1727096147.04682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096147.04685: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096147.04687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096147.04701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096147.04736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096147.24500: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-23 08:55:47.203502", "end": "2024-09-23 08:55:47.241847", "delta": "0:00:00.038345", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not 
delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11701 1727096147.26483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 11701 1727096147.26487: stdout chunk (state=3): >>><<< 11701 1727096147.26492: stderr chunk (state=3): >>><<< 11701 1727096147.26518: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-23 08:55:47.203502", "end": "2024-09-23 08:55:47.241847", "delta": "0:00:00.038345", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
11701 1727096147.26565: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096146.9318306-13174-128406042279991/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096147.26779: _low_level_execute_command(): starting 11701 1727096147.26782: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096146.9318306-13174-128406042279991/ > /dev/null 2>&1 && sleep 0' 11701 1727096147.27517: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096147.27528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096147.27801: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096147.29607: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096147.29611: stderr chunk (state=3): >>><<< 11701 1727096147.29614: stdout chunk (state=3): >>><<< 11701 1727096147.29632: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096147.29659: handler run complete 11701 1727096147.29662: Evaluated conditional (False): False 11701 1727096147.29676: attempt loop complete, returning result 11701 1727096147.29679: _execute() done 11701 1727096147.29681: dumping result to json 11701 1727096147.29766: done dumping result, returning 11701 1727096147.29774: done running TaskExecutor() for managed_node3/TASK: Remove test interfaces [0afff68d-5257-a05c-c957-0000000000c5] 11701 1727096147.29776: sending task result for task 0afff68d-5257-a05c-c957-0000000000c5 11701 1727096147.29853: done sending task result for task 0afff68d-5257-a05c-c957-0000000000c5 11701 1727096147.29856: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.038345", "end": "2024-09-23 08:55:47.241847", "rc": 0, "start": "2024-09-23 08:55:47.203502" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 11701 1727096147.29919: no more pending results, returning what we have 11701 1727096147.29923: results queue empty 11701 1727096147.29924: checking for any_errors_fatal 11701 1727096147.29935: done checking for any_errors_fatal 11701 1727096147.29936: checking for max_fail_percentage 11701 1727096147.29938: done checking for max_fail_percentage 11701 1727096147.29938: checking to see if all hosts have failed and the running result is not ok 11701 1727096147.29939: done checking to see if all hosts have failed 11701 1727096147.29940: getting the remaining hosts for this loop 11701 1727096147.29941: done getting the remaining hosts for this loop 11701 1727096147.29944: getting the next task for host managed_node3 11701 1727096147.29951: done getting next task for host managed_node3 11701 1727096147.29953: ^ task is: TASK: Stop dnsmasq/radvd services 11701 1727096147.29957: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11701 1727096147.29962: getting variables 11701 1727096147.29964: in VariableManager get_vars() 11701 1727096147.30013: Calling all_inventory to load vars for managed_node3 11701 1727096147.30016: Calling groups_inventory to load vars for managed_node3 11701 1727096147.30019: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096147.30031: Calling all_plugins_play to load vars for managed_node3 11701 1727096147.30034: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096147.30038: Calling groups_plugins_play to load vars for managed_node3 11701 1727096147.32085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096147.33594: done with get_vars() 11701 1727096147.33627: done getting variables 11701 1727096147.33693: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Monday 23 September 2024 08:55:47 -0400 (0:00:00.455) 0:00:31.301 ****** 11701 1727096147.33726: entering _queue_task() for managed_node3/shell 11701 1727096147.34299: worker is 1 (out of 1 available) 11701 1727096147.34310: exiting _queue_task() for managed_node3/shell 11701 1727096147.34319: done queuing things up, now waiting for results queue to drain 11701 1727096147.34320: waiting for pending results... 
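The "Remove test interfaces" result above comes from the shell snippet passed as _raw_params to ansible.legacy.command: it attempts ip link delete for each of test1, test2 and testbr and reports (rather than aborts on) a non-zero exit code. A minimal Python sketch of that same tolerant-delete loop, assuming the link names shown in the log and that the ip binary is on PATH (the function name is illustrative, not part of the test role):

# Sketch only: mirrors the "delete or warn" loop from the shell snippet above,
# assuming the test1/test2/testbr link names from the log and `ip` on PATH.
import subprocess

def delete_links(names=("test1", "test2", "testbr")):
    for name in names:
        # `ip link delete` exits non-zero if the link does not exist;
        # like the playbook snippet, report the code and keep going.
        result = subprocess.run(["ip", "link", "delete", name],
                                capture_output=True, text=True)
        if result.returncode != 0:
            print(f"ERROR - could not delete link {name} - error {result.returncode}")

if __name__ == "__main__":
    delete_links()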
11701 1727096147.34587: running TaskExecutor() for managed_node3/TASK: Stop dnsmasq/radvd services 11701 1727096147.34598: in run() - task 0afff68d-5257-a05c-c957-0000000000c6 11701 1727096147.34621: variable 'ansible_search_path' from source: unknown 11701 1727096147.34630: variable 'ansible_search_path' from source: unknown 11701 1727096147.34681: calling self._execute() 11701 1727096147.34789: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096147.34808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096147.34825: variable 'omit' from source: magic vars 11701 1727096147.35234: variable 'ansible_distribution_major_version' from source: facts 11701 1727096147.35257: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096147.35343: variable 'omit' from source: magic vars 11701 1727096147.35347: variable 'omit' from source: magic vars 11701 1727096147.35381: variable 'omit' from source: magic vars 11701 1727096147.35428: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096147.35478: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096147.35506: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096147.35529: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096147.35547: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096147.35593: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096147.35602: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096147.35611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096147.35786: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096147.35789: Set connection var ansible_timeout to 10 11701 1727096147.35792: Set connection var ansible_shell_type to sh 11701 1727096147.35794: Set connection var ansible_shell_executable to /bin/sh 11701 1727096147.35796: Set connection var ansible_connection to ssh 11701 1727096147.35798: Set connection var ansible_pipelining to False 11701 1727096147.35811: variable 'ansible_shell_executable' from source: unknown 11701 1727096147.35819: variable 'ansible_connection' from source: unknown 11701 1727096147.35826: variable 'ansible_module_compression' from source: unknown 11701 1727096147.35834: variable 'ansible_shell_type' from source: unknown 11701 1727096147.35841: variable 'ansible_shell_executable' from source: unknown 11701 1727096147.35847: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096147.35858: variable 'ansible_pipelining' from source: unknown 11701 1727096147.35866: variable 'ansible_timeout' from source: unknown 11701 1727096147.35884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096147.36037: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096147.36057: variable 'omit' from source: magic vars 11701 
1727096147.36100: starting attempt loop 11701 1727096147.36103: running the handler 11701 1727096147.36106: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096147.36118: _low_level_execute_command(): starting 11701 1727096147.36131: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096147.36988: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096147.37040: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096147.37082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096147.38758: stdout chunk (state=3): >>>/root <<< 11701 1727096147.38910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096147.38913: stdout chunk (state=3): >>><<< 11701 1727096147.38916: stderr chunk (state=3): >>><<< 11701 1727096147.39038: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096147.39041: _low_level_execute_command(): starting 11701 1727096147.39045: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir 
-p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096147.3893993-13206-147348400031585 `" && echo ansible-tmp-1727096147.3893993-13206-147348400031585="` echo /root/.ansible/tmp/ansible-tmp-1727096147.3893993-13206-147348400031585 `" ) && sleep 0' 11701 1727096147.39597: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096147.39617: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096147.39633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096147.39657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096147.39694: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096147.39785: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096147.39811: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096147.39834: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096147.39862: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096147.39939: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096147.41941: stdout chunk (state=3): >>>ansible-tmp-1727096147.3893993-13206-147348400031585=/root/.ansible/tmp/ansible-tmp-1727096147.3893993-13206-147348400031585 <<< 11701 1727096147.42141: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096147.42192: stderr chunk (state=3): >>><<< 11701 1727096147.42195: stdout chunk (state=3): >>><<< 11701 1727096147.42249: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096147.3893993-13206-147348400031585=/root/.ansible/tmp/ansible-tmp-1727096147.3893993-13206-147348400031585 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096147.42254: variable 'ansible_module_compression' from source: unknown 11701 1727096147.42363: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11701ub3f79xu/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11701 1727096147.42366: variable 'ansible_facts' from source: unknown 11701 1727096147.42466: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096147.3893993-13206-147348400031585/AnsiballZ_command.py 11701 1727096147.42806: Sending initial data 11701 1727096147.42809: Sent initial data (156 bytes) 11701 1727096147.43587: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096147.43610: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096147.43626: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096147.43649: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096147.43715: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096147.45460: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096147.45485: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11701 1727096147.45555: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmp8e_j9t03 /root/.ansible/tmp/ansible-tmp-1727096147.3893993-13206-147348400031585/AnsiballZ_command.py <<< 11701 1727096147.45558: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096147.3893993-13206-147348400031585/AnsiballZ_command.py" <<< 11701 1727096147.45604: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmp8e_j9t03" to remote "/root/.ansible/tmp/ansible-tmp-1727096147.3893993-13206-147348400031585/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096147.3893993-13206-147348400031585/AnsiballZ_command.py" <<< 11701 1727096147.46595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096147.46674: stdout chunk (state=3): >>><<< 11701 1727096147.46677: stderr chunk (state=3): >>><<< 11701 1727096147.46680: done transferring module to remote 11701 1727096147.46682: _low_level_execute_command(): starting 11701 1727096147.46685: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096147.3893993-13206-147348400031585/ /root/.ansible/tmp/ansible-tmp-1727096147.3893993-13206-147348400031585/AnsiballZ_command.py && sleep 0' 11701 1727096147.47297: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096147.47387: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096147.47426: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096147.47447: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096147.47472: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096147.47537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096147.49903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096147.49907: stdout chunk (state=3): >>><<< 11701 1727096147.49909: stderr chunk (state=3): >>><<< 11701 1727096147.49912: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096147.49914: _low_level_execute_command(): starting 11701 1727096147.49917: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096147.3893993-13206-147348400031585/AnsiballZ_command.py && sleep 0' 11701 1727096147.50484: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096147.50499: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096147.50511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096147.50529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096147.50546: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096147.50562: stderr chunk (state=3): >>>debug2: match not found <<< 11701 1727096147.50586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096147.50606: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11701 1727096147.50619: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 11701 1727096147.50633: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11701 1727096147.50645: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096147.50663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096147.50683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096147.50696: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096147.50776: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096147.50796: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096147.50863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096147.69221: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf 
/run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-23 08:55:47.661947", "end": "2024-09-23 08:55:47.688445", "delta": "0:00:00.026498", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11701 1727096147.70812: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096147.70882: stderr chunk (state=3): >>>Shared connection to 10.31.14.152 closed. <<< 11701 1727096147.70896: stdout chunk (state=3): >>><<< 11701 1727096147.70915: stderr chunk (state=3): >>><<< 11701 1727096147.70942: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-23 08:55:47.661947", "end": "2024-09-23 08:55:47.688445", "delta": "0:00:00.026498", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 11701 1727096147.70998: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096147.3893993-13206-147348400031585/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096147.71019: _low_level_execute_command(): starting 11701 1727096147.71036: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096147.3893993-13206-147348400031585/ > /dev/null 2>&1 && sleep 0' 11701 1727096147.71681: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096147.71698: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096147.71725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096147.71744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096147.71763: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096147.71839: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096147.71882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096147.71900: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096147.71918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096147.71987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096147.73898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096147.73939: stderr chunk (state=3): >>><<< 11701 1727096147.73943: stdout chunk (state=3): >>><<< 11701 1727096147.74073: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096147.74077: handler run complete 11701 1727096147.74079: Evaluated conditional (False): False 11701 1727096147.74081: attempt loop complete, returning result 11701 1727096147.74083: _execute() done 11701 1727096147.74085: dumping result to json 11701 1727096147.74087: done dumping result, returning 11701 1727096147.74089: done running TaskExecutor() for managed_node3/TASK: Stop dnsmasq/radvd services [0afff68d-5257-a05c-c957-0000000000c6] 11701 1727096147.74091: sending task result for task 0afff68d-5257-a05c-c957-0000000000c6 11701 1727096147.74154: done sending task result for task 0afff68d-5257-a05c-c957-0000000000c6 11701 1727096147.74157: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.026498", "end": "2024-09-23 08:55:47.688445", "rc": 0, "start": "2024-09-23 08:55:47.661947" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active 
firewalld inactive 11701 1727096147.74440: no more pending results, returning what we have 11701 1727096147.74443: results queue empty 11701 1727096147.74444: checking for any_errors_fatal 11701 1727096147.74451: done checking for any_errors_fatal 11701 1727096147.74454: checking for max_fail_percentage 11701 1727096147.74456: done checking for max_fail_percentage 11701 1727096147.74457: checking to see if all hosts have failed and the running result is not ok 11701 1727096147.74457: done checking to see if all hosts have failed 11701 1727096147.74458: getting the remaining hosts for this loop 11701 1727096147.74459: done getting the remaining hosts for this loop 11701 1727096147.74462: getting the next task for host managed_node3 11701 1727096147.74472: done getting next task for host managed_node3 11701 1727096147.74475: ^ task is: TASK: Restore the /etc/resolv.conf for initscript 11701 1727096147.74478: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11701 1727096147.74482: getting variables 11701 1727096147.74484: in VariableManager get_vars() 11701 1727096147.74524: Calling all_inventory to load vars for managed_node3 11701 1727096147.74527: Calling groups_inventory to load vars for managed_node3 11701 1727096147.74530: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096147.74545: Calling all_plugins_play to load vars for managed_node3 11701 1727096147.74548: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096147.74554: Calling groups_plugins_play to load vars for managed_node3 11701 1727096147.76485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096147.78101: done with get_vars() 11701 1727096147.78130: done getting variables 11701 1727096147.78297: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Restore the /etc/resolv.conf for initscript] ***************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:120 Monday 23 September 2024 08:55:47 -0400 (0:00:00.446) 0:00:31.747 ****** 11701 1727096147.78330: entering _queue_task() for managed_node3/command 11701 1727096147.79260: worker is 1 (out of 1 available) 11701 1727096147.79274: exiting _queue_task() for managed_node3/command 11701 1727096147.79285: done queuing things up, now waiting for results queue to drain 11701 1727096147.79286: waiting for pending results... 
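Each task in this log follows the same remote-execution lifecycle: _low_level_execute_command() creates a per-task temp dir under /root/.ansible/tmp, the cached AnsiballZ_command.py payload is transferred over SFTP, chmod u+x is applied, the module is run with /usr/bin/python3.12 (producing the JSON result seen on stdout), and the temp dir is removed. A rough standalone sketch of that sequence, assuming the plain ssh/scp CLIs stand in for Ansible's multiplexed connection plugin; the host constant, paths and helper names are hypothetical:

# Rough sketch of the per-task lifecycle visible in the log: temp dir,
# module upload, chmod, execute, cleanup. ssh/scp stand in for Ansible's
# connection plugin and the SFTP put seen in the stderr chunks.
import subprocess
import time

HOST = "root@10.31.14.152"          # target address seen in the log
REMOTE_PY = "/usr/bin/python3.12"   # remote interpreter used in the log

def ssh(command):
    # Each _low_level_execute_command() boils down to `/bin/sh -c '<command>'` on the remote.
    return subprocess.run(["ssh", HOST, command], capture_output=True, text=True, check=True)

def run_module(local_module):
    tmp = f"/root/.ansible/tmp/ansible-tmp-{time.time()}"
    ssh(f"umask 77 && mkdir -p {tmp}")                         # create the task temp dir
    subprocess.run(["scp", local_module,
                    f"{HOST}:{tmp}/AnsiballZ_command.py"], check=True)
    ssh(f"chmod u+x {tmp} {tmp}/AnsiballZ_command.py")         # make dir and module executable
    result = ssh(f"{REMOTE_PY} {tmp}/AnsiballZ_command.py")    # run the module, JSON on stdout
    ssh(f"rm -f -r {tmp}")                                     # clean up, as in the final step above
    return result.stdout

In the log itself these steps all reuse the ControlMaster socket at /root/.ansible/cp/e9699315b0, which is why every step reports "mux_client_request_session" instead of a fresh SSH login.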
11701 1727096147.79895: running TaskExecutor() for managed_node3/TASK: Restore the /etc/resolv.conf for initscript 11701 1727096147.80086: in run() - task 0afff68d-5257-a05c-c957-0000000000c7 11701 1727096147.80101: variable 'ansible_search_path' from source: unknown 11701 1727096147.80266: calling self._execute() 11701 1727096147.80539: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096147.80546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096147.80772: variable 'omit' from source: magic vars 11701 1727096147.81376: variable 'ansible_distribution_major_version' from source: facts 11701 1727096147.81392: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096147.81510: variable 'network_provider' from source: set_fact 11701 1727096147.81515: Evaluated conditional (network_provider == "initscripts"): False 11701 1727096147.81519: when evaluation is False, skipping this task 11701 1727096147.81521: _execute() done 11701 1727096147.81524: dumping result to json 11701 1727096147.81526: done dumping result, returning 11701 1727096147.81539: done running TaskExecutor() for managed_node3/TASK: Restore the /etc/resolv.conf for initscript [0afff68d-5257-a05c-c957-0000000000c7] 11701 1727096147.81549: sending task result for task 0afff68d-5257-a05c-c957-0000000000c7 11701 1727096147.81649: done sending task result for task 0afff68d-5257-a05c-c957-0000000000c7 11701 1727096147.81653: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11701 1727096147.81707: no more pending results, returning what we have 11701 1727096147.81712: results queue empty 11701 1727096147.81713: checking for any_errors_fatal 11701 1727096147.81726: done checking for any_errors_fatal 11701 1727096147.81727: checking for max_fail_percentage 11701 1727096147.81729: done checking for max_fail_percentage 11701 1727096147.81730: checking to see if all hosts have failed and the running result is not ok 11701 1727096147.81731: done checking to see if all hosts have failed 11701 1727096147.81732: getting the remaining hosts for this loop 11701 1727096147.81733: done getting the remaining hosts for this loop 11701 1727096147.81738: getting the next task for host managed_node3 11701 1727096147.81749: done getting next task for host managed_node3 11701 1727096147.81872: ^ task is: TASK: Verify network state restored to default 11701 1727096147.81877: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11701 1727096147.81882: getting variables 11701 1727096147.81884: in VariableManager get_vars() 11701 1727096147.81933: Calling all_inventory to load vars for managed_node3 11701 1727096147.81936: Calling groups_inventory to load vars for managed_node3 11701 1727096147.81939: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096147.81951: Calling all_plugins_play to load vars for managed_node3 11701 1727096147.81958: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096147.81961: Calling groups_plugins_play to load vars for managed_node3 11701 1727096147.83661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096147.85537: done with get_vars() 11701 1727096147.85576: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:125 Monday 23 September 2024 08:55:47 -0400 (0:00:00.073) 0:00:31.821 ****** 11701 1727096147.85679: entering _queue_task() for managed_node3/include_tasks 11701 1727096147.86210: worker is 1 (out of 1 available) 11701 1727096147.86223: exiting _queue_task() for managed_node3/include_tasks 11701 1727096147.86234: done queuing things up, now waiting for results queue to drain 11701 1727096147.86236: waiting for pending results... 11701 1727096147.86789: running TaskExecutor() for managed_node3/TASK: Verify network state restored to default 11701 1727096147.86795: in run() - task 0afff68d-5257-a05c-c957-0000000000c8 11701 1727096147.86798: variable 'ansible_search_path' from source: unknown 11701 1727096147.86802: calling self._execute() 11701 1727096147.86804: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096147.86807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096147.86810: variable 'omit' from source: magic vars 11701 1727096147.87224: variable 'ansible_distribution_major_version' from source: facts 11701 1727096147.87228: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096147.87231: _execute() done 11701 1727096147.87233: dumping result to json 11701 1727096147.87235: done dumping result, returning 11701 1727096147.87238: done running TaskExecutor() for managed_node3/TASK: Verify network state restored to default [0afff68d-5257-a05c-c957-0000000000c8] 11701 1727096147.87240: sending task result for task 0afff68d-5257-a05c-c957-0000000000c8 11701 1727096147.87321: done sending task result for task 0afff68d-5257-a05c-c957-0000000000c8 11701 1727096147.87324: WORKER PROCESS EXITING 11701 1727096147.87355: no more pending results, returning what we have 11701 1727096147.87362: in VariableManager get_vars() 11701 1727096147.87419: Calling all_inventory to load vars for managed_node3 11701 1727096147.87422: Calling groups_inventory to load vars for managed_node3 11701 1727096147.87425: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096147.87440: Calling all_plugins_play to load vars for managed_node3 11701 1727096147.87443: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096147.87446: Calling groups_plugins_play to load vars for managed_node3 11701 1727096147.90354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096147.92068: done with get_vars() 11701 1727096147.92095: 
variable 'ansible_search_path' from source: unknown 11701 1727096147.92166: we have included files to process 11701 1727096147.92169: generating all_blocks data 11701 1727096147.92172: done generating all_blocks data 11701 1727096147.92177: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 11701 1727096147.92178: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 11701 1727096147.92181: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 11701 1727096147.92640: done processing included file 11701 1727096147.92642: iterating over new_blocks loaded from include file 11701 1727096147.92643: in VariableManager get_vars() 11701 1727096147.92663: done with get_vars() 11701 1727096147.92664: filtering new block on tags 11701 1727096147.92706: done filtering new block on tags 11701 1727096147.92708: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node3 11701 1727096147.92714: extending task lists for all hosts with included blocks 11701 1727096147.94030: done extending task lists 11701 1727096147.94032: done processing included files 11701 1727096147.94033: results queue empty 11701 1727096147.94033: checking for any_errors_fatal 11701 1727096147.94037: done checking for any_errors_fatal 11701 1727096147.94038: checking for max_fail_percentage 11701 1727096147.94039: done checking for max_fail_percentage 11701 1727096147.94039: checking to see if all hosts have failed and the running result is not ok 11701 1727096147.94040: done checking to see if all hosts have failed 11701 1727096147.94041: getting the remaining hosts for this loop 11701 1727096147.94042: done getting the remaining hosts for this loop 11701 1727096147.94045: getting the next task for host managed_node3 11701 1727096147.94049: done getting next task for host managed_node3 11701 1727096147.94054: ^ task is: TASK: Check routes and DNS 11701 1727096147.94058: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11701 1727096147.94061: getting variables 11701 1727096147.94062: in VariableManager get_vars() 11701 1727096147.94079: Calling all_inventory to load vars for managed_node3 11701 1727096147.94082: Calling groups_inventory to load vars for managed_node3 11701 1727096147.94129: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096147.94136: Calling all_plugins_play to load vars for managed_node3 11701 1727096147.94138: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096147.94141: Calling groups_plugins_play to load vars for managed_node3 11701 1727096147.96280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096147.98096: done with get_vars() 11701 1727096147.98128: done getting variables 11701 1727096147.98494: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Monday 23 September 2024 08:55:47 -0400 (0:00:00.128) 0:00:31.949 ****** 11701 1727096147.98529: entering _queue_task() for managed_node3/shell 11701 1727096147.99319: worker is 1 (out of 1 available) 11701 1727096147.99331: exiting _queue_task() for managed_node3/shell 11701 1727096147.99343: done queuing things up, now waiting for results queue to drain 11701 1727096147.99344: waiting for pending results... 11701 1727096147.99683: running TaskExecutor() for managed_node3/TASK: Check routes and DNS 11701 1727096147.99974: in run() - task 0afff68d-5257-a05c-c957-00000000056d 11701 1727096147.99985: variable 'ansible_search_path' from source: unknown 11701 1727096147.99988: variable 'ansible_search_path' from source: unknown 11701 1727096147.99991: calling self._execute() 11701 1727096148.00010: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096148.00017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096148.00033: variable 'omit' from source: magic vars 11701 1727096148.00426: variable 'ansible_distribution_major_version' from source: facts 11701 1727096148.00439: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096148.00640: variable 'omit' from source: magic vars 11701 1727096148.00644: variable 'omit' from source: magic vars 11701 1727096148.00646: variable 'omit' from source: magic vars 11701 1727096148.00648: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096148.00651: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096148.00679: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096148.00699: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096148.00712: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096148.00745: variable 'inventory_hostname' from source: host vars for 'managed_node3' 
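The run of _low_level_execute_command() calls that follows is the standard per-task bootstrap over the multiplexed SSH connection. Condensed from the commands visible in this log, the sequence is roughly the sketch below; TMP stands in for the unique /root/.ansible/tmp/ansible-tmp-<timestamp>-<counter>-<random> directory each task gets, and the real one-liners use the backtick-echo form quoted further down:

TMP=/root/.ansible/tmp/ansible-tmp-EXAMPLE                                            # placeholder for the per-task directory
/bin/sh -c 'echo ~ && sleep 0'                                                        # discover the remote user's home directory
/bin/sh -c "( umask 77 && mkdir -p /root/.ansible/tmp && mkdir $TMP ) && sleep 0"     # create a private temp dir
# AnsiballZ_command.py is then copied into $TMP over sftp (the SSH2_FXP_* chunks below)
/bin/sh -c "chmod u+x $TMP/ $TMP/AnsiballZ_command.py && sleep 0"                     # make the dir and module executable
/bin/sh -c "/usr/bin/python3.12 $TMP/AnsiballZ_command.py && sleep 0"                 # run the module; JSON result on stdout
/bin/sh -c "rm -f -r $TMP/ > /dev/null 2>&1 && sleep 0"                               # remove the temp dir afterwards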
11701 1727096148.00749: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096148.00751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096148.00872: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096148.00883: Set connection var ansible_timeout to 10 11701 1727096148.00890: Set connection var ansible_shell_type to sh 11701 1727096148.00897: Set connection var ansible_shell_executable to /bin/sh 11701 1727096148.00900: Set connection var ansible_connection to ssh 11701 1727096148.00909: Set connection var ansible_pipelining to False 11701 1727096148.00934: variable 'ansible_shell_executable' from source: unknown 11701 1727096148.00937: variable 'ansible_connection' from source: unknown 11701 1727096148.00940: variable 'ansible_module_compression' from source: unknown 11701 1727096148.00942: variable 'ansible_shell_type' from source: unknown 11701 1727096148.00945: variable 'ansible_shell_executable' from source: unknown 11701 1727096148.00947: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096148.00949: variable 'ansible_pipelining' from source: unknown 11701 1727096148.00951: variable 'ansible_timeout' from source: unknown 11701 1727096148.00960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096148.01127: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096148.01146: variable 'omit' from source: magic vars 11701 1727096148.01149: starting attempt loop 11701 1727096148.01151: running the handler 11701 1727096148.01159: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096148.01181: _low_level_execute_command(): starting 11701 1727096148.01190: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096148.02077: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096148.02102: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096148.02132: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 <<< 11701 1727096148.02207: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096148.03920: stdout chunk (state=3): >>>/root <<< 11701 1727096148.04014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096148.04178: stderr chunk (state=3): >>><<< 11701 1727096148.04292: stdout chunk (state=3): >>><<< 11701 1727096148.04296: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096148.04298: _low_level_execute_command(): starting 11701 1727096148.04301: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096148.0421808-13242-211333022005482 `" && echo ansible-tmp-1727096148.0421808-13242-211333022005482="` echo /root/.ansible/tmp/ansible-tmp-1727096148.0421808-13242-211333022005482 `" ) && sleep 0' 11701 1727096148.04977: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096148.05061: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096148.05100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096148.07101: stdout chunk (state=3): 
>>>ansible-tmp-1727096148.0421808-13242-211333022005482=/root/.ansible/tmp/ansible-tmp-1727096148.0421808-13242-211333022005482 <<< 11701 1727096148.07255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096148.07259: stdout chunk (state=3): >>><<< 11701 1727096148.07261: stderr chunk (state=3): >>><<< 11701 1727096148.07473: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096148.0421808-13242-211333022005482=/root/.ansible/tmp/ansible-tmp-1727096148.0421808-13242-211333022005482 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096148.07477: variable 'ansible_module_compression' from source: unknown 11701 1727096148.07479: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11701ub3f79xu/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11701 1727096148.07482: variable 'ansible_facts' from source: unknown 11701 1727096148.07506: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096148.0421808-13242-211333022005482/AnsiballZ_command.py 11701 1727096148.07733: Sending initial data 11701 1727096148.07736: Sent initial data (156 bytes) 11701 1727096148.08355: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096148.08382: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096148.08493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096148.08510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096148.08528: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 11701 1727096148.08552: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096148.08706: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096148.10274: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11701 1727096148.10286: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 11701 1727096148.10323: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 11701 1727096148.10331: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096148.10382: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11701 1727096148.10437: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpw5ndgt6f /root/.ansible/tmp/ansible-tmp-1727096148.0421808-13242-211333022005482/AnsiballZ_command.py <<< 11701 1727096148.10440: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096148.0421808-13242-211333022005482/AnsiballZ_command.py" <<< 11701 1727096148.10490: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpw5ndgt6f" to remote "/root/.ansible/tmp/ansible-tmp-1727096148.0421808-13242-211333022005482/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096148.0421808-13242-211333022005482/AnsiballZ_command.py" <<< 11701 1727096148.11216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096148.11287: stderr chunk (state=3): >>><<< 11701 1727096148.11363: stdout chunk (state=3): >>><<< 11701 1727096148.11376: done transferring module to remote 11701 1727096148.11392: _low_level_execute_command(): starting 11701 1727096148.11402: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096148.0421808-13242-211333022005482/ /root/.ansible/tmp/ansible-tmp-1727096148.0421808-13242-211333022005482/AnsiballZ_command.py && sleep 0' 11701 1727096148.12083: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096148.12095: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096148.12107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096148.12133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096148.12148: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096148.12242: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096148.12270: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096148.12291: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096148.12367: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096148.14306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096148.14318: stdout chunk (state=3): >>><<< 11701 1727096148.14340: stderr chunk (state=3): >>><<< 11701 1727096148.14370: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096148.14381: _low_level_execute_command(): starting 11701 1727096148.14473: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096148.0421808-13242-211333022005482/AnsiballZ_command.py && sleep 0' 11701 1727096148.15095: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096148.15117: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096148.15237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096148.15298: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096148.15329: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096148.32501: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:df:7b:8e:75 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.14.152/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3332sec preferred_lft 3332sec\n inet6 fe80::8ff:dfff:fe7b:8e75/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.152 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.152 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-23 08:55:48.311014", "end": "2024-09-23 08:55:48.320059", "delta": "0:00:00.009045", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11701 1727096148.34192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 11701 1727096148.34229: stderr chunk (state=3): >>><<< 11701 1727096148.34232: stdout chunk (state=3): >>><<< 11701 1727096148.34275: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:df:7b:8e:75 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.14.152/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3332sec preferred_lft 3332sec\n inet6 fe80::8ff:dfff:fe7b:8e75/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.152 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.152 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-23 08:55:48.311014", "end": "2024-09-23 08:55:48.320059", "delta": "0:00:00.009045", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
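For reference, the _raw_params script behind the "Check routes and DNS" result above, unescaped from the cmd field (contents verbatim from the log; only the indentation is reflowed):

set -euo pipefail
echo IP
ip a
echo IP ROUTE
ip route
echo IP -6 ROUTE
ip -6 route
echo RESOLV
if [ -f /etc/resolv.conf ]; then
    cat /etc/resolv.conf
else
    echo NO /etc/resolv.conf
    ls -alrtF /etc/resolv.* || :
fi

Each echo line labels the section that follows it, which is where the IP / IP ROUTE / IP -6 ROUTE / RESOLV headings in the captured STDOUT come from.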
11701 1727096148.34318: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096148.0421808-13242-211333022005482/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096148.34774: _low_level_execute_command(): starting 11701 1727096148.34780: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096148.0421808-13242-211333022005482/ > /dev/null 2>&1 && sleep 0' 11701 1727096148.35580: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096148.35596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096148.35654: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096148.35673: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096148.36086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096148.36155: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096148.38092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096148.38111: stderr chunk (state=3): >>><<< 11701 1727096148.38119: stdout chunk (state=3): >>><<< 11701 1727096148.38141: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096148.38155: handler run complete 11701 1727096148.38200: Evaluated conditional (False): False 11701 1727096148.38290: attempt loop complete, returning result 11701 1727096148.38298: _execute() done 11701 1727096148.38305: dumping result to json 11701 1727096148.38315: done dumping result, returning 11701 1727096148.38327: done running TaskExecutor() for managed_node3/TASK: Check routes and DNS [0afff68d-5257-a05c-c957-00000000056d] 11701 1727096148.38335: sending task result for task 0afff68d-5257-a05c-c957-00000000056d ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.009045", "end": "2024-09-23 08:55:48.320059", "rc": 0, "start": "2024-09-23 08:55:48.311014" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:df:7b:8e:75 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.14.152/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 3332sec preferred_lft 3332sec inet6 fe80::8ff:dfff:fe7b:8e75/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.152 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.152 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 11701 1727096148.38605: no more pending results, returning what we have 11701 1727096148.38609: results queue empty 11701 1727096148.38610: checking for any_errors_fatal 11701 1727096148.38612: done checking for any_errors_fatal 11701 1727096148.38612: checking for max_fail_percentage 11701 1727096148.38614: done checking for max_fail_percentage 11701 1727096148.38615: checking to see if all hosts have failed and the running result is not ok 11701 1727096148.38616: done checking to see if all hosts have failed 11701 1727096148.38616: getting the remaining hosts for this loop 11701 1727096148.38618: done getting the remaining hosts for this loop 11701 1727096148.38621: getting the next task for host managed_node3 11701 1727096148.38629: done getting next task for host managed_node3 11701 1727096148.38632: ^ task is: TASK: Verify DNS and network connectivity 11701 1727096148.38636: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11701 1727096148.38642: getting variables 11701 1727096148.38643: in VariableManager get_vars() 11701 1727096148.38689: Calling all_inventory to load vars for managed_node3 11701 1727096148.38691: Calling groups_inventory to load vars for managed_node3 11701 1727096148.38694: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096148.38705: Calling all_plugins_play to load vars for managed_node3 11701 1727096148.38708: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096148.38710: Calling groups_plugins_play to load vars for managed_node3 11701 1727096148.39373: done sending task result for task 0afff68d-5257-a05c-c957-00000000056d 11701 1727096148.39377: WORKER PROCESS EXITING 11701 1727096148.40463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096148.42123: done with get_vars() 11701 1727096148.42165: done getting variables 11701 1727096148.42230: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Monday 23 September 2024 08:55:48 -0400 (0:00:00.437) 0:00:32.387 ****** 11701 1727096148.42275: entering _queue_task() for managed_node3/shell 11701 1727096148.42673: worker is 1 (out of 1 available) 11701 1727096148.42782: exiting _queue_task() for managed_node3/shell 11701 1727096148.42798: done queuing things up, now waiting for results queue to drain 11701 1727096148.42799: waiting for pending results... 
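Every remote command in this run reuses one SSH control master, which is what the recurring "auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0'" and "mux_client_request_session" lines show. An illustrative way to confirm from the controller that such a master is still alive (not part of the test run; the ControlPath and host are taken from the log, and root is assumed as the login user since the remote home is /root):

ssh -O check -o ControlPath=/root/.ansible/cp/e9699315b0 root@10.31.14.152
# prints "Master running (pid=...)" when the multiplexed connection is up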
11701 1727096148.43025: running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity 11701 1727096148.43165: in run() - task 0afff68d-5257-a05c-c957-00000000056e 11701 1727096148.43188: variable 'ansible_search_path' from source: unknown 11701 1727096148.43195: variable 'ansible_search_path' from source: unknown 11701 1727096148.43242: calling self._execute() 11701 1727096148.43360: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096148.43374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096148.43389: variable 'omit' from source: magic vars 11701 1727096148.43777: variable 'ansible_distribution_major_version' from source: facts 11701 1727096148.43797: Evaluated conditional (ansible_distribution_major_version != '6'): True 11701 1727096148.43946: variable 'ansible_facts' from source: unknown 11701 1727096148.44779: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 11701 1727096148.44795: variable 'omit' from source: magic vars 11701 1727096148.44849: variable 'omit' from source: magic vars 11701 1727096148.44905: variable 'omit' from source: magic vars 11701 1727096148.44975: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11701 1727096148.45017: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11701 1727096148.45047: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11701 1727096148.45083: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096148.45102: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11701 1727096148.45187: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11701 1727096148.45191: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096148.45193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096148.45295: Set connection var ansible_module_compression to ZIP_DEFLATED 11701 1727096148.45298: Set connection var ansible_timeout to 10 11701 1727096148.45303: Set connection var ansible_shell_type to sh 11701 1727096148.45308: Set connection var ansible_shell_executable to /bin/sh 11701 1727096148.45310: Set connection var ansible_connection to ssh 11701 1727096148.45373: Set connection var ansible_pipelining to False 11701 1727096148.45376: variable 'ansible_shell_executable' from source: unknown 11701 1727096148.45378: variable 'ansible_connection' from source: unknown 11701 1727096148.45380: variable 'ansible_module_compression' from source: unknown 11701 1727096148.45382: variable 'ansible_shell_type' from source: unknown 11701 1727096148.45384: variable 'ansible_shell_executable' from source: unknown 11701 1727096148.45385: variable 'ansible_host' from source: host vars for 'managed_node3' 11701 1727096148.45387: variable 'ansible_pipelining' from source: unknown 11701 1727096148.45389: variable 'ansible_timeout' from source: unknown 11701 1727096148.45391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11701 1727096148.45545: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096148.45566: variable 'omit' from source: magic vars 11701 1727096148.45578: starting attempt loop 11701 1727096148.45585: running the handler 11701 1727096148.45599: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11701 1727096148.45626: _low_level_execute_command(): starting 11701 1727096148.45643: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11701 1727096148.46505: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096148.46531: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096148.46581: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096148.46615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096148.48330: stdout chunk (state=3): >>>/root <<< 11701 1727096148.48506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096148.48510: stdout chunk (state=3): >>><<< 11701 1727096148.48513: stderr chunk (state=3): >>><<< 11701 1727096148.48516: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096148.48575: _low_level_execute_command(): starting 11701 1727096148.48580: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096148.4851644-13271-222583190538472 `" && echo ansible-tmp-1727096148.4851644-13271-222583190538472="` echo /root/.ansible/tmp/ansible-tmp-1727096148.4851644-13271-222583190538472 `" ) && sleep 0' 11701 1727096148.49113: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096148.49117: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096148.49133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096148.49177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096148.49186: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096148.49285: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096148.49337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096148.51322: stdout chunk (state=3): >>>ansible-tmp-1727096148.4851644-13271-222583190538472=/root/.ansible/tmp/ansible-tmp-1727096148.4851644-13271-222583190538472 <<< 11701 1727096148.51429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096148.51460: stderr chunk (state=3): >>><<< 11701 1727096148.51463: stdout chunk (state=3): >>><<< 11701 1727096148.51488: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096148.4851644-13271-222583190538472=/root/.ansible/tmp/ansible-tmp-1727096148.4851644-13271-222583190538472 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096148.51514: variable 'ansible_module_compression' from source: unknown 11701 1727096148.51555: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11701ub3f79xu/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11701 1727096148.51778: variable 'ansible_facts' from source: unknown 11701 1727096148.51783: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096148.4851644-13271-222583190538472/AnsiballZ_command.py 11701 1727096148.51865: Sending initial data 11701 1727096148.51880: Sent initial data (156 bytes) 11701 1727096148.52278: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096148.52284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 11701 1727096148.52290: stderr chunk (state=3): >>>debug2: match not found <<< 11701 1727096148.52319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096148.52322: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096148.52324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 11701 1727096148.52327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096148.52388: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096148.52391: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096148.52393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096148.52433: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096148.54096: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11701 1727096148.54115: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11701 1727096148.54148: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpjez4jaw_ /root/.ansible/tmp/ansible-tmp-1727096148.4851644-13271-222583190538472/AnsiballZ_command.py <<< 11701 1727096148.54163: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096148.4851644-13271-222583190538472/AnsiballZ_command.py" <<< 11701 1727096148.54184: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 11701 1727096148.54189: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11701ub3f79xu/tmpjez4jaw_" to remote "/root/.ansible/tmp/ansible-tmp-1727096148.4851644-13271-222583190538472/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096148.4851644-13271-222583190538472/AnsiballZ_command.py" <<< 11701 1727096148.54652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096148.54701: stderr chunk (state=3): >>><<< 11701 1727096148.54706: stdout chunk (state=3): >>><<< 11701 1727096148.54730: done transferring module to remote 11701 1727096148.54739: _low_level_execute_command(): starting 11701 1727096148.54744: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096148.4851644-13271-222583190538472/ /root/.ansible/tmp/ansible-tmp-1727096148.4851644-13271-222583190538472/AnsiballZ_command.py && sleep 0' 11701 1727096148.55173: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11701 1727096148.55181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11701 1727096148.55203: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096148.55211: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11701 1727096148.55213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096148.55266: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096148.55277: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096148.55306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096148.57188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096148.57217: stderr chunk (state=3): >>><<< 11701 1727096148.57221: stdout chunk (state=3): >>><<< 11701 1727096148.57236: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096148.57239: _low_level_execute_command(): starting 11701 1727096148.57244: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096148.4851644-13271-222583190538472/AnsiballZ_command.py && sleep 0' 11701 1727096148.57917: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096148.57990: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 11701 1727096148.58008: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096148.58034: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096148.58117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096149.07081: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org 
mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 3766 0 --:--:-- --:--:-- --:--:-- 3812\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1365 0 --:--:-- --:--:-- --:--:-- 1359", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-23 08:55:48.738439", "end": "2024-09-23 08:55:49.067529", "delta": "0:00:00.329090", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11701 1727096149.08816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 11701 1727096149.08830: stderr chunk (state=3): >>><<< 11701 1727096149.08854: stdout chunk (state=3): >>><<< 11701 1727096149.09016: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 3766 0 --:--:-- --:--:-- --:--:-- 3812\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1365 0 --:--:-- --:--:-- --:--:-- 1359", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-23 08:55:48.738439", "end": "2024-09-23 08:55:49.067529", "delta": "0:00:00.329090", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 11701 1727096149.09027: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096148.4851644-13271-222583190538472/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11701 1727096149.09030: _low_level_execute_command(): starting 11701 1727096149.09033: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096148.4851644-13271-222583190538472/ > /dev/null 2>&1 && sleep 0' 11701 1727096149.09623: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11701 1727096149.09684: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11701 1727096149.09766: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 11701 1727096149.09790: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11701 1727096149.09866: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11701 1727096149.11810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11701 1727096149.11815: stdout chunk (state=3): >>><<< 11701 1727096149.11817: stderr chunk (state=3): >>><<< 11701 1727096149.11974: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 
10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11701 1727096149.11978: handler run complete 11701 1727096149.11981: Evaluated conditional (False): False 11701 1727096149.11983: attempt loop complete, returning result 11701 1727096149.11985: _execute() done 11701 1727096149.11987: dumping result to json 11701 1727096149.11989: done dumping result, returning 11701 1727096149.11991: done running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity [0afff68d-5257-a05c-c957-00000000056e] 11701 1727096149.11993: sending task result for task 0afff68d-5257-a05c-c957-00000000056e ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.329090", "end": "2024-09-23 08:55:49.067529", "rc": 0, "start": "2024-09-23 08:55:48.738439" } STDOUT: CHECK DNS AND CONNECTIVITY 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 3766 0 --:--:-- --:--:-- --:--:-- 3812 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 1365 0 --:--:-- --:--:-- --:--:-- 1359 11701 1727096149.12145: no more pending results, returning what we have 11701 1727096149.12150: results queue empty 11701 1727096149.12151: checking for any_errors_fatal 11701 1727096149.12162: done checking for any_errors_fatal 11701 1727096149.12163: checking for max_fail_percentage 11701 1727096149.12165: done checking for max_fail_percentage 11701 1727096149.12166: checking to see if all hosts have failed and the running result is not ok 11701 1727096149.12169: 
done checking to see if all hosts have failed 11701 1727096149.12170: getting the remaining hosts for this loop 11701 1727096149.12171: done getting the remaining hosts for this loop 11701 1727096149.12175: getting the next task for host managed_node3 11701 1727096149.12187: done getting next task for host managed_node3 11701 1727096149.12190: ^ task is: TASK: meta (flush_handlers) 11701 1727096149.12192: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11701 1727096149.12203: getting variables 11701 1727096149.12205: in VariableManager get_vars() 11701 1727096149.12251: Calling all_inventory to load vars for managed_node3 11701 1727096149.12256: Calling groups_inventory to load vars for managed_node3 11701 1727096149.12260: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096149.12475: Calling all_plugins_play to load vars for managed_node3 11701 1727096149.12479: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096149.12484: Calling groups_plugins_play to load vars for managed_node3 11701 1727096149.13112: done sending task result for task 0afff68d-5257-a05c-c957-00000000056e 11701 1727096149.13116: WORKER PROCESS EXITING 11701 1727096149.14256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096149.15916: done with get_vars() 11701 1727096149.15949: done getting variables 11701 1727096149.16032: in VariableManager get_vars() 11701 1727096149.16048: Calling all_inventory to load vars for managed_node3 11701 1727096149.16050: Calling groups_inventory to load vars for managed_node3 11701 1727096149.16055: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096149.16061: Calling all_plugins_play to load vars for managed_node3 11701 1727096149.16063: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096149.16066: Calling groups_plugins_play to load vars for managed_node3 11701 1727096149.17307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096149.19059: done with get_vars() 11701 1727096149.19103: done queuing things up, now waiting for results queue to drain 11701 1727096149.19105: results queue empty 11701 1727096149.19106: checking for any_errors_fatal 11701 1727096149.19110: done checking for any_errors_fatal 11701 1727096149.19111: checking for max_fail_percentage 11701 1727096149.19112: done checking for max_fail_percentage 11701 1727096149.19113: checking to see if all hosts have failed and the running result is not ok 11701 1727096149.19113: done checking to see if all hosts have failed 11701 1727096149.19114: getting the remaining hosts for this loop 11701 1727096149.19115: done getting the remaining hosts for this loop 11701 1727096149.19118: getting the next task for host managed_node3 11701 1727096149.19122: done getting next task for host managed_node3 11701 1727096149.19124: ^ task is: TASK: meta (flush_handlers) 11701 1727096149.19125: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 11701 1727096149.19128: getting variables 11701 1727096149.19130: in VariableManager get_vars() 11701 1727096149.19154: Calling all_inventory to load vars for managed_node3 11701 1727096149.19157: Calling groups_inventory to load vars for managed_node3 11701 1727096149.19160: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096149.19165: Calling all_plugins_play to load vars for managed_node3 11701 1727096149.19170: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096149.19174: Calling groups_plugins_play to load vars for managed_node3 11701 1727096149.20505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096149.22175: done with get_vars() 11701 1727096149.22213: done getting variables 11701 1727096149.22318: in VariableManager get_vars() 11701 1727096149.22335: Calling all_inventory to load vars for managed_node3 11701 1727096149.22338: Calling groups_inventory to load vars for managed_node3 11701 1727096149.22340: Calling all_plugins_inventory to load vars for managed_node3 11701 1727096149.22345: Calling all_plugins_play to load vars for managed_node3 11701 1727096149.22347: Calling groups_plugins_inventory to load vars for managed_node3 11701 1727096149.22350: Calling groups_plugins_play to load vars for managed_node3 11701 1727096149.23544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11701 1727096149.25659: done with get_vars() 11701 1727096149.25698: done queuing things up, now waiting for results queue to drain 11701 1727096149.25700: results queue empty 11701 1727096149.25701: checking for any_errors_fatal 11701 1727096149.25703: done checking for any_errors_fatal 11701 1727096149.25703: checking for max_fail_percentage 11701 1727096149.25704: done checking for max_fail_percentage 11701 1727096149.25705: checking to see if all hosts have failed and the running result is not ok 11701 1727096149.25706: done checking to see if all hosts have failed 11701 1727096149.25706: getting the remaining hosts for this loop 11701 1727096149.25707: done getting the remaining hosts for this loop 11701 1727096149.25710: getting the next task for host managed_node3 11701 1727096149.25714: done getting next task for host managed_node3 11701 1727096149.25715: ^ task is: None 11701 1727096149.25716: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11701 1727096149.25717: done queuing things up, now waiting for results queue to drain 11701 1727096149.25718: results queue empty 11701 1727096149.25719: checking for any_errors_fatal 11701 1727096149.25719: done checking for any_errors_fatal 11701 1727096149.25720: checking for max_fail_percentage 11701 1727096149.25721: done checking for max_fail_percentage 11701 1727096149.25722: checking to see if all hosts have failed and the running result is not ok 11701 1727096149.25723: done checking to see if all hosts have failed 11701 1727096149.25725: getting the next task for host managed_node3 11701 1727096149.25727: done getting next task for host managed_node3 11701 1727096149.25728: ^ task is: None 11701 1727096149.25729: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node3 : ok=76 changed=2 unreachable=0 failed=0 skipped=60 rescued=0 ignored=0

Monday 23 September 2024 08:55:49 -0400 (0:00:00.835) 0:00:33.223 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 1.93s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.92s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Create test interfaces -------------------------------------------------- 1.77s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
Gathering Facts --------------------------------------------------------- 1.31s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:6
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.18s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Install dnsmasq --------------------------------------------------------- 1.10s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.00s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gathering Facts --------------------------------------------------------- 0.96s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 0.96s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 0.91s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.87s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Verify DNS and network connectivity ------------------------------------- 0.84s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.82s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gather the minimum subset of ansible_facts required by the network role test --- 0.77s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Check if system is ostree ----------------------------------------------- 0.76s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Install pgrep, sysctl --------------------------------------------------- 0.74s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26
Get NM profile info ----------------------------------------------------- 0.52s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25
** TEST check polling interval ------------------------------------------ 0.50s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:75
** TEST check IPv4 ------------------------------------------------------ 0.48s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:80
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.48s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
11701 1727096149.25897: RUNNING CLEANUP
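Editor's note: for readability, this is the shell script run by the "Verify DNS and network connectivity" command task above, reconstructed from the _raw_params shown in the module invocation (indentation is approximate, since the flattened log collapses whitespace):

set -euo pipefail
echo CHECK DNS AND CONNECTIVITY
for host in mirrors.fedoraproject.org mirrors.centos.org; do
    # Fail fast if the name does not resolve
    if ! getent hosts "$host"; then
        echo FAILED to lookup host "$host"
        exit 1
    fi
    # Fail fast if an HTTPS request to the mirror does not complete
    if ! curl -o /dev/null https://"$host"; then
        echo FAILED to contact host "$host"
        exit 1
    fi
done

The progress-meter lines captured in the task's STDERR come from curl running without --silent; the task finishes with rc=0, so both lookups and both HTTPS probes succeeded.

The surrounding _low_level_execute_command() calls are Ansible's usual remote-execution sequence for a module. Summarized from this run (paths shortened; the real temp directory name, ansible-tmp-1727096148.4851644-13271-222583190538472, is abbreviated to <tmpdir> below, and the full commands appear verbatim in the log above):

/bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `" && mkdir "` echo <tmpdir> `" ... ) && sleep 0'   # create the remote temp dir
sftp> put .../tmpjez4jaw_ <tmpdir>/AnsiballZ_command.py                                                          # transfer the AnsiballZ-packed command module
/bin/sh -c 'chmod u+x <tmpdir>/ <tmpdir>/AnsiballZ_command.py && sleep 0'                                        # make it executable
/bin/sh -c '/usr/bin/python3.12 <tmpdir>/AnsiballZ_command.py && sleep 0'                                        # run the module; it prints the JSON result shown above
/bin/sh -c 'rm -f -r <tmpdir>/ > /dev/null 2>&1 && sleep 0'                                                      # remove the temp dir afterwards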